Dec 09 03:11:57 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 09 03:11:57 crc restorecon[4744]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 09 03:11:57 crc restorecon[4744]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc 
restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc 
restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 
03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc 
restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc 
restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57
crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 
03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc 
restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc 
restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc 
restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 09 03:11:57 crc restorecon[4744]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 03:11:57 crc restorecon[4744]: 
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 
crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc 
restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:57 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 09 03:11:58 crc restorecon[4744]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc 
restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 03:11:58 crc restorecon[4744]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 09 03:11:58 crc kubenswrapper[4766]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 03:11:58 crc kubenswrapper[4766]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 09 03:11:58 crc kubenswrapper[4766]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 03:11:58 crc kubenswrapper[4766]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 09 03:11:58 crc kubenswrapper[4766]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 09 03:11:58 crc kubenswrapper[4766]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.666050 4766 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672256 4766 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672279 4766 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672285 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672291 4766 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672297 4766 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672305 4766 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672314 4766 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672320 4766 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672326 4766 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672331 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672336 4766 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672342 4766 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672347 4766 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672352 4766 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672357 4766 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672362 4766 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672367 4766 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672372 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672378 4766 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672383 4766 feature_gate.go:330] unrecognized feature gate: Example
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672388 4766 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672394 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672401 4766 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672407 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672413 4766 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672419 4766 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672425 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672430 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672449 4766 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672456 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672463 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672468 4766 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672501 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672511 4766 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672518 4766 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672524 4766 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672529 4766 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672534 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672540 4766 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672555 4766 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672561 4766 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672566 4766 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672573 4766 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672579 4766 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672584 4766 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672589 4766 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672595 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672601 4766 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672606 4766 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672611 4766 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672616 4766 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672621 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672626 4766 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672631 4766 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672647 4766 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672653 4766 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672658 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672664 4766 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672669 4766 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672674 4766 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672679 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672684 4766 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672689 4766 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672695 4766 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672700 4766 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672714 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672720 4766 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672726 4766 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672731 4766 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672736 4766 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.672742 4766 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673086 4766 flags.go:64] FLAG: --address="0.0.0.0"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673103 4766 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673115 4766 flags.go:64] FLAG: --anonymous-auth="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673123 4766 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673131 4766 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673137 4766 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673145 4766 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673153 4766 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673160 4766 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673166 4766 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673173 4766 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673180 4766 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673186 4766 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673192 4766 flags.go:64] FLAG: --cgroup-root=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673198 4766 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673205 4766 flags.go:64] FLAG: --client-ca-file=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673232 4766 flags.go:64] FLAG: --cloud-config=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673239 4766 flags.go:64] FLAG: --cloud-provider=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673245 4766 flags.go:64] FLAG: --cluster-dns="[]"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673252 4766 flags.go:64] FLAG: --cluster-domain=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673257 4766 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673263 4766 flags.go:64] FLAG: --config-dir=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673269 4766 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673276 4766 flags.go:64] FLAG: --container-log-max-files="5"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673293 4766 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673299 4766 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673305 4766 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673312 4766 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673318 4766 flags.go:64] FLAG: --contention-profiling="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673324 4766 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673330 4766 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673336 4766 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673342 4766 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673351 4766 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673357 4766 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673362 4766 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673368 4766 flags.go:64] FLAG: --enable-load-reader="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673375 4766 flags.go:64] FLAG: --enable-server="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673381 4766 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673390 4766 flags.go:64] FLAG: --event-burst="100"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673397 4766 flags.go:64] FLAG: --event-qps="50"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673403 4766 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673409 4766 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673415 4766 flags.go:64] FLAG: --eviction-hard=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673422 4766 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673428 4766 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673435 4766 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673450 4766 flags.go:64] FLAG: --eviction-soft=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673456 4766 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673462 4766 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673469 4766 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673474 4766 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673480 4766 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673486 4766 flags.go:64] FLAG: --fail-swap-on="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673492 4766 flags.go:64] FLAG: --feature-gates=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673506 4766 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673512 4766 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673518 4766 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673527 4766 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673533 4766 flags.go:64] FLAG: --healthz-port="10248"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673540 4766 flags.go:64] FLAG: --help="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673546 4766 flags.go:64] FLAG: --hostname-override=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673552 4766 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673558 4766 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673564 4766 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673570 4766 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673576 4766 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673582 4766 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673588 4766 flags.go:64] FLAG: --image-service-endpoint=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673596 4766 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673602 4766 flags.go:64] FLAG: --kube-api-burst="100"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673608 4766 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673615 4766 flags.go:64] FLAG: --kube-api-qps="50"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673621 4766 flags.go:64] FLAG: --kube-reserved=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673627 4766 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673633 4766 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673639 4766 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673645 4766 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673651 4766 flags.go:64] FLAG: --lock-file=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673657 4766 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673663 4766 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673669 4766 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673682 4766 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673694 4766 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673701 4766 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673707 4766 flags.go:64] FLAG: --logging-format="text"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673712 4766 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673719 4766 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673725 4766 flags.go:64] FLAG: --manifest-url=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673730 4766 flags.go:64] FLAG: --manifest-url-header=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673742 4766 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673748 4766 flags.go:64] FLAG: --max-open-files="1000000"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673757 4766 flags.go:64] FLAG: --max-pods="110"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673762 4766 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673768 4766 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673774 4766 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673780 4766 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673787 4766 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673793 4766 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673800 4766 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673814 4766 flags.go:64] FLAG: --node-status-max-images="50"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673822 4766 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673829 4766 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673835 4766 flags.go:64] FLAG: --pod-cidr=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673841 4766 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673851 4766 flags.go:64] FLAG: --pod-manifest-path=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673857 4766 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673864 4766 flags.go:64] FLAG: --pods-per-core="0"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673870 4766 flags.go:64] FLAG: --port="10250"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673882 4766 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673888 4766 flags.go:64] FLAG: --provider-id=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673894 4766 flags.go:64] FLAG: --qos-reserved=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673901 4766 flags.go:64] FLAG: --read-only-port="10255"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673907 4766 flags.go:64] FLAG: --register-node="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673913 4766 flags.go:64] FLAG: --register-schedulable="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673920 4766 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673930 4766 flags.go:64] FLAG: --registry-burst="10"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673936 4766 flags.go:64] FLAG: --registry-qps="5"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673942 4766 flags.go:64] FLAG: --reserved-cpus=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673949 4766 flags.go:64] FLAG: --reserved-memory=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673957 4766 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673963 4766 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673972 4766 flags.go:64] FLAG: --rotate-certificates="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673978 4766 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673984 4766 flags.go:64] FLAG: --runonce="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673990 4766 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.673997 4766 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674003 4766 flags.go:64] FLAG: --seccomp-default="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674009 4766 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674015 4766 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674021 4766 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674028 4766 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674034 4766 flags.go:64] FLAG: --storage-driver-password="root"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674040 4766 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674046 4766 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674052 4766 flags.go:64] FLAG: --storage-driver-user="root"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674058 4766 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674065 4766 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674072 4766 flags.go:64] FLAG: --system-cgroups=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674078 4766 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674088 4766 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674095 4766 flags.go:64] FLAG: --tls-cert-file=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674101 4766 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674109 4766 flags.go:64] FLAG: --tls-min-version=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674115 4766 flags.go:64] FLAG: --tls-private-key-file=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674121 4766 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674127 4766 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674133 4766 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674140 4766 flags.go:64] FLAG: --v="2"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674151 4766 flags.go:64] FLAG: --version="false"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674159 4766 flags.go:64] FLAG: --vmodule=""
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674167 4766 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674174 4766 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674343 4766 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674351 4766 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674360 4766 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674366 4766 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674372 4766 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674378 4766 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674383 4766 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674389 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674395 4766 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674400 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674406 4766 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674411 4766 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674417 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674424 4766 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674431 4766 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674437 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674442 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674448 4766 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674454 4766 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674460 4766 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674466 4766 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674472 4766 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674477 4766 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674483 4766 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674489 4766 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674494 4766 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674499 4766 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674504 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674520 4766 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674525 4766 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674531 4766 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674536 4766 feature_gate.go:330] unrecognized feature gate: Example
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674541 4766 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674547 4766 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674552 4766 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674558 4766 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674563 4766 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674568 4766 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674575 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674580 4766 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674585 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674600 4766 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674607 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674613 4766 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674619 4766 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674625 4766 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674630 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674636 4766 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674642 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674651 4766 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674665 4766 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674673 4766 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674679 4766 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674686 4766 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674693 4766 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674700 4766 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674706 4766 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674712 4766 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674718 4766 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674723 4766 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674728 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674734 4766 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674740 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674745 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674751 4766 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674756 4766 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674762 4766 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674767 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674773 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674779 4766 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.674784 4766 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.674793 4766 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.687308 4766 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.687340 4766 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687429 4766 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687445 4766 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687452 4766 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687458 4766 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687465 4766 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687472 4766 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687477 4766 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687482 4766 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687487 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687493 4766 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687498 4766 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687503 4766 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687508 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687513 4766 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687518 4766 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687524 4766 feature_gate.go:330]
unrecognized feature gate: VSphereMultiNetworks Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687529 4766 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687535 4766 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687541 4766 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687547 4766 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687553 4766 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687560 4766 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687566 4766 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687571 4766 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687577 4766 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687582 4766 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687587 4766 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687593 4766 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687598 4766 feature_gate.go:330] unrecognized feature gate: Example Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687604 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 03:11:58 crc 
kubenswrapper[4766]: W1209 03:11:58.687609 4766 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687616 4766 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687623 4766 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687629 4766 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687636 4766 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687643 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687650 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687656 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687662 4766 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687668 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687673 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687679 4766 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687684 4766 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687689 4766 feature_gate.go:330] unrecognized feature gate: 
MachineAPIProviderOpenStack Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687695 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687700 4766 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687705 4766 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687710 4766 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687715 4766 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687720 4766 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687726 4766 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687731 4766 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687736 4766 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687743 4766 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687748 4766 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687753 4766 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687759 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687764 4766 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 03:11:58 crc kubenswrapper[4766]: 
W1209 03:11:58.687771 4766 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687777 4766 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687782 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687789 4766 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687795 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687802 4766 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687807 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687813 4766 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687830 4766 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687836 4766 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687842 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687848 4766 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.687855 4766 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.687864 4766 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true 
DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688042 4766 feature_gate.go:330] unrecognized feature gate: Example Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688053 4766 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688061 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688068 4766 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688074 4766 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688081 4766 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688087 4766 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688093 4766 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688098 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688104 4766 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688109 4766 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688114 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688119 4766 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688125 4766 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688131 4766 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688136 4766 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688141 4766 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688147 4766 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688152 4766 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688158 4766 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688163 4766 feature_gate.go:330] unrecognized 
feature gate: NetworkLiveMigration Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688168 4766 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688173 4766 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688178 4766 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688183 4766 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688188 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688193 4766 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688198 4766 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688203 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688208 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688240 4766 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688249 4766 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688257 4766 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688263 4766 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688272 4766 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688279 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688285 4766 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688291 4766 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688298 4766 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688304 4766 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688309 4766 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688314 4766 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688319 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688326 4766 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688332 4766 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688339 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688345 4766 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688350 4766 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688356 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688361 4766 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688366 4766 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688371 4766 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688377 4766 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688382 4766 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688387 4766 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688393 4766 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688398 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688403 4766 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 
03:11:58.688408 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688414 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688419 4766 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688424 4766 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688429 4766 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688434 4766 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688440 4766 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688445 4766 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688450 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688455 4766 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688460 4766 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688465 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.688471 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.688478 4766 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.688672 4766 server.go:940] "Client rotation is on, will bootstrap in background" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.692031 4766 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.692132 4766 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.692804 4766 server.go:997] "Starting client certificate rotation" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.692838 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.693313 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-07 14:43:45.063458824 +0000 UTC Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.693522 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.705635 4766 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.707950 4766 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.708987 4766 certificate_manager.go:562] 
"Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.718578 4766 log.go:25] "Validated CRI v1 runtime API" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.742823 4766 log.go:25] "Validated CRI v1 image API" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.744723 4766 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.747181 4766 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-09-03-07-18-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.747235 4766 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.762305 4766 manager.go:217] Machine: {Timestamp:2025-12-09 03:11:58.759771675 +0000 UTC m=+0.469077121 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2701ecc7-fbfe-4321-9495-f77bb9b59c76 BootID:45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d7:e6:d4 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d7:e6:d4 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:fa:1b:29 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d5:cd:d8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a1:f1:5f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:fa:c3:3a Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:1c:d7:75 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:46:3d:c4:75:78:15 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:31:d1:f7:ed:94 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 
Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.762525 4766 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.762724 4766 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.763018 4766 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.763170 4766 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.763221 4766 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.763449 4766 topology_manager.go:138] "Creating topology manager with none policy" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.763461 4766 container_manager_linux.go:303] "Creating device plugin manager" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.763772 4766 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.764668 4766 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.765073 4766 state_mem.go:36] "Initialized new in-memory state store" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.765192 4766 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.766295 4766 kubelet.go:418] "Attempting to sync node with API server" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.766324 4766 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.766359 4766 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.766385 4766 kubelet.go:324] "Adding apiserver pod source" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.766404 4766 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.768456 4766 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.769116 4766 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.769490 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.769547 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.769605 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.769631 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.770661 4766 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771351 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771381 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 
03:11:58.771391 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771402 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771429 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771459 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771469 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771501 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771514 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771525 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771623 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.771636 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.772969 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.773574 4766 server.go:1280] "Started kubelet" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.773633 4766 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.773799 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.773919 4766 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.775906 4766 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 09 03:11:58 crc systemd[1]: Started Kubernetes Kubelet. Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.777612 4766 server.go:460] "Adding debug handlers to kubelet server" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.778749 4766 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.779275 4766 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.779911 4766 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.779945 4766 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.780123 4766 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.780695 4766 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.782243 4766 factory.go:55] Registering systemd factory Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.782270 4766 factory.go:221] Registration of the systemd container factory successfully Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.782953 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 
38.102.83.196:6443: connect: connection refused Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.783063 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.783053 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="200ms" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.779546 4766 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187f6d72cc48b71a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 03:11:58.773540634 +0000 UTC m=+0.482846070,LastTimestamp:2025-12-09 03:11:58.773540634 +0000 UTC m=+0.482846070,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.783465 4766 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 03:44:37.28711294 +0000 UTC Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.783537 4766 factory.go:153] Registering CRI-O factory 
Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.783578 4766 factory.go:221] Registration of the crio container factory successfully Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.783788 4766 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.784336 4766 factory.go:103] Registering Raw factory Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.784385 4766 manager.go:1196] Started watching for new ooms in manager Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.789271 4766 manager.go:319] Starting recovery of all containers Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793378 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793429 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793445 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793459 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793475 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793489 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793508 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793522 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793538 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793552 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793565 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793579 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793611 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793628 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793641 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793656 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793670 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793684 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793697 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793712 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793726 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793759 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793774 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793789 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793803 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793817 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793834 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793848 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793862 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793876 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793890 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793982 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.793996 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794010 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794026 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794039 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794053 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794067 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794081 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794094 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794108 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794123 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794137 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794150 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794163 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794177 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794191 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794209 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794241 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794255 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794273 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794288 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" 
seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794308 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794596 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794625 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794643 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794658 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794673 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 
03:11:58.794689 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794703 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794716 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794730 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.794799 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796436 4766 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796522 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796579 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796612 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796647 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796677 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796703 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796729 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796752 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796776 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796799 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796822 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796842 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796865 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796891 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796912 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796933 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796955 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.796980 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797008 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797030 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797057 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797079 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797101 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797122 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" 
seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797143 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797164 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797187 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797262 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797286 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797308 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797340 4766 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797364 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797387 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797410 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797440 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797464 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797488 4766 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797511 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797534 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797557 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797579 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797624 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797662 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797694 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797722 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797747 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797773 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797804 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797830 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797856 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797882 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797913 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797967 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.797996 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798018 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" 
seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798042 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798066 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798091 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798116 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798137 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798159 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: 
I1209 03:11:58.798188 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798240 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798268 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798292 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798315 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798337 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798360 4766 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798383 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798406 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798429 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798452 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798479 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798500 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798527 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798560 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798591 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798613 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798635 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798659 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798684 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798711 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798738 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798794 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798815 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798837 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" 
seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798858 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798879 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798900 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798923 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798944 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.798965 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 
03:11:58.798988 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799011 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799034 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799059 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799081 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799102 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799125 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799146 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799168 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799192 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799241 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799264 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799287 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799308 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799333 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799354 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799375 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799395 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799415 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" 
seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799470 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799494 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799516 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799539 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799560 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799581 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: 
I1209 03:11:58.799602 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799625 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799645 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799667 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799689 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799711 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799733 4766 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799754 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799774 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799794 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799819 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799840 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799862 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799885 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799906 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799932 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799954 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799978 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.799998 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800020 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800045 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800066 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800092 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800123 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800146 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" 
seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800167 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800193 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800254 4766 reconstruct.go:97] "Volume reconstruction finished" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.800271 4766 reconciler.go:26] "Reconciler: start to sync state" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.817320 4766 manager.go:324] Recovery completed Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.835690 4766 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.837803 4766 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.837862 4766 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.837902 4766 kubelet.go:2335] "Starting kubelet main sync loop" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.837981 4766 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 09 03:11:58 crc kubenswrapper[4766]: W1209 03:11:58.839478 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.839612 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.839736 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.841858 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.841914 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.841931 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.843328 4766 
cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.843349 4766 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.843376 4766 state_mem.go:36] "Initialized new in-memory state store" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.856271 4766 policy_none.go:49] "None policy: Start" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.857550 4766 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.857582 4766 state_mem.go:35] "Initializing new in-memory state store" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.880817 4766 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.912266 4766 manager.go:334] "Starting Device Plugin manager" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.912417 4766 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.912439 4766 server.go:79] "Starting device plugin registration server" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.912900 4766 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.912917 4766 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.913164 4766 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.913256 4766 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.913272 4766 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.919809 4766 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.938101 4766 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.938251 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.939591 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.939658 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.939679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.939981 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.941378 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.941426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.941443 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: 
I1209 03:11:58.941442 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.941632 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.941657 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942252 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942292 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942694 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942734 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942734 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942756 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942770 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.942940 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.943044 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.943075 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.943321 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.943367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.943378 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.943935 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.943970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.943980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.944078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.944120 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.944134 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.944378 4766 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.944821 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.944866 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.945456 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.945493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.945510 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.945522 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.945538 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.945553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.945738 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.945771 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.946594 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.946618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:58 crc kubenswrapper[4766]: I1209 03:11:58.946626 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:58 crc kubenswrapper[4766]: E1209 03:11:58.984788 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="400ms" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.002832 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.002899 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.002962 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003109 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003173 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003195 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003257 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003317 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003364 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003417 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003458 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003491 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003519 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003563 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.003617 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.013513 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.014644 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.014701 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.014722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.014768 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 03:11:59 crc kubenswrapper[4766]: E1209 03:11:59.015680 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection 
refused" node="crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.105396 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106004 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106172 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.105821 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106308 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 
03:11:59.106416 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106683 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106718 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106752 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106789 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106819 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106850 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106884 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106914 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106926 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106988 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 
03:11:59.107002 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107042 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106931 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.106946 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107110 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107121 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107060 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107144 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107173 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107179 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107086 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107196 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107313 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.107352 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.216897 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.218342 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.218391 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.218403 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.218434 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 03:11:59 crc kubenswrapper[4766]: E1209 03:11:59.218928 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" 
node="crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.266404 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.278139 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.285889 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.307113 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.311882 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:11:59 crc kubenswrapper[4766]: W1209 03:11:59.316084 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cc9e75e8474294da85c0803062c2ed53fcf80736011aceb04723cca2cdcc47bc WatchSource:0}: Error finding container cc9e75e8474294da85c0803062c2ed53fcf80736011aceb04723cca2cdcc47bc: Status 404 returned error can't find the container with id cc9e75e8474294da85c0803062c2ed53fcf80736011aceb04723cca2cdcc47bc Dec 09 03:11:59 crc kubenswrapper[4766]: W1209 03:11:59.325743 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-53deb68e00d7bd68a33c6c1c473fab0c4f4d0b800c02c8c332dd42c989615da6 WatchSource:0}: Error finding container 53deb68e00d7bd68a33c6c1c473fab0c4f4d0b800c02c8c332dd42c989615da6: Status 404 returned error can't find the container with id 
53deb68e00d7bd68a33c6c1c473fab0c4f4d0b800c02c8c332dd42c989615da6 Dec 09 03:11:59 crc kubenswrapper[4766]: W1209 03:11:59.336889 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b4d5622dc9c51c249c36f58e574616dfce37a7372437e8caf5917db8616b368e WatchSource:0}: Error finding container b4d5622dc9c51c249c36f58e574616dfce37a7372437e8caf5917db8616b368e: Status 404 returned error can't find the container with id b4d5622dc9c51c249c36f58e574616dfce37a7372437e8caf5917db8616b368e Dec 09 03:11:59 crc kubenswrapper[4766]: W1209 03:11:59.342314 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e7b1ee034707b555959f06efab7f50fbfb08d6c9b1adf6d36dca0b2ec4f0e096 WatchSource:0}: Error finding container e7b1ee034707b555959f06efab7f50fbfb08d6c9b1adf6d36dca0b2ec4f0e096: Status 404 returned error can't find the container with id e7b1ee034707b555959f06efab7f50fbfb08d6c9b1adf6d36dca0b2ec4f0e096 Dec 09 03:11:59 crc kubenswrapper[4766]: E1209 03:11:59.385719 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="800ms" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.619809 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.621683 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.621740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:59 crc 
kubenswrapper[4766]: I1209 03:11:59.621751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.621781 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 03:11:59 crc kubenswrapper[4766]: E1209 03:11:59.622305 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Dec 09 03:11:59 crc kubenswrapper[4766]: W1209 03:11:59.703342 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:11:59 crc kubenswrapper[4766]: E1209 03:11:59.703448 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.775961 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.784185 4766 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 01:12:41.233394179 +0000 UTC Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.846460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.846634 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4d5622dc9c51c249c36f58e574616dfce37a7372437e8caf5917db8616b368e"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.848797 4766 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="731d57a43e5481755f5aade6fd96b713977c020c9adf77bdf8e690c6e3414d37" exitCode=0 Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.848860 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"731d57a43e5481755f5aade6fd96b713977c020c9adf77bdf8e690c6e3414d37"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.848890 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"53deb68e00d7bd68a33c6c1c473fab0c4f4d0b800c02c8c332dd42c989615da6"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.849043 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.850188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.850243 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.850253 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.850694 4766 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="59a1408e62adecfe4bfd6ab1a09ba2213f110fcd93fbf54ef7042febba52ccfd" exitCode=0 Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.850746 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"59a1408e62adecfe4bfd6ab1a09ba2213f110fcd93fbf54ef7042febba52ccfd"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.850776 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cc9e75e8474294da85c0803062c2ed53fcf80736011aceb04723cca2cdcc47bc"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.850826 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.851808 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.851831 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.851839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.852910 4766 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1" exitCode=0 Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.852981 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.853005 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e7b1ee034707b555959f06efab7f50fbfb08d6c9b1adf6d36dca0b2ec4f0e096"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.853065 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.853818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.853839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.853848 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.855831 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350" exitCode=0 Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.855878 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.855911 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"496b20fab7316d3a1a951cc07d6dd0e60f50ebabaf2ba04ccfe4b6f895c54d39"} Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.856029 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.856984 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.857012 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.857022 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.859677 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.860347 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.860369 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:11:59 crc kubenswrapper[4766]: I1209 03:11:59.860377 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:11:59 crc kubenswrapper[4766]: W1209 03:11:59.951707 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:11:59 crc kubenswrapper[4766]: E1209 03:11:59.951803 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:12:00 crc kubenswrapper[4766]: E1209 03:12:00.187916 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="1.6s" Dec 09 03:12:00 crc kubenswrapper[4766]: W1209 03:12:00.226062 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:12:00 crc kubenswrapper[4766]: E1209 03:12:00.226184 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:12:00 crc kubenswrapper[4766]: W1209 03:12:00.252352 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 09 03:12:00 crc kubenswrapper[4766]: E1209 03:12:00.252451 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" 
logger="UnhandledError" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.422883 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.424429 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.424474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.424487 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.424516 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 03:12:00 crc kubenswrapper[4766]: E1209 03:12:00.424963 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.767723 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 09 03:12:00 crc kubenswrapper[4766]: E1209 03:12:00.769152 4766 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.775128 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.196:6443: connect: connection refused Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.785338 4766 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 17:32:38.247122506 +0000 UTC Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.785431 4766 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 110h20m37.461694355s for next certificate rotation Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.862269 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.862346 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.862361 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.862375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.865171 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.865254 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.865273 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.865274 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.866413 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.866448 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.866461 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.868002 4766 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d69b2347a0ec67392bdcf4654ce0514d17519e3edd44b53fe24bc2cf49372de7" exitCode=0 Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.868055 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d69b2347a0ec67392bdcf4654ce0514d17519e3edd44b53fe24bc2cf49372de7"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.868112 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.868948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.868984 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.868996 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.869766 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4fe076c13be85d36b4a984a037b1f6958db9073dcb94ba38908145259d43ac38"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.869870 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.870757 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.870794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.870810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.874231 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.874283 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.874297 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8"} Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.874417 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.875239 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.875270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:00 crc kubenswrapper[4766]: I1209 03:12:00.875282 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.882700 4766 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e49edd84f63ae4c39e39494cadfe96efd8db014c741eb998908fb3cc1da1cbea" exitCode=0 Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.882784 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e49edd84f63ae4c39e39494cadfe96efd8db014c741eb998908fb3cc1da1cbea"} Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.882861 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.883825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.883863 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.883876 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.887051 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61"} Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.887106 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.887146 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.888113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.888157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.888167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 
03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.888731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.888788 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:01 crc kubenswrapper[4766]: I1209 03:12:01.888810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.025847 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.026987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.027030 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.027047 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.027079 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.896475 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e6b7e06a5ab1868b3d41878b8937875144566718e10806233e98b9754154d541"} Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.896555 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"78f00948adfba79a2dfa1dc0463471b712100d1f5ec7f472abb97ab790bd8d99"} Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.896575 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d6a5ddd7c86e52a226dc7b865d48b03fa9beac96c7be800d0f64a1c109524922"} Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.896600 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.896505 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.896701 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.896613 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8c3ec085d869d34e526512884b5dad62cc1fe8eeab2b20e5e7394f55666dcc2e"} Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.896935 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c58a36af88a1c9fafe479878b3878809abec590395a8eb69feaa9df1d7ff8534"} Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.898282 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.898318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.898335 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.898410 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.898463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:02 crc kubenswrapper[4766]: I1209 03:12:02.898484 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.043799 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.044075 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.046254 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.046321 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.046333 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.273520 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.391602 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.391866 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.393873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:03 crc 
kubenswrapper[4766]: I1209 03:12:03.393939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.393962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.400953 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.773008 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.898821 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.898989 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.899005 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.900149 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.900270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.900293 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.900881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.900940 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.900957 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.900904 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.901031 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:03 crc kubenswrapper[4766]: I1209 03:12:03.901050 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.039153 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.812079 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.902819 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.902863 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.902922 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.904975 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.905013 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:04 crc 
kubenswrapper[4766]: I1209 03:12:04.905024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.905026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.905062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.905108 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.905119 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.905070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:04 crc kubenswrapper[4766]: I1209 03:12:04.905160 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:06 crc kubenswrapper[4766]: I1209 03:12:06.430758 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:12:06 crc kubenswrapper[4766]: I1209 03:12:06.431266 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:06 crc kubenswrapper[4766]: I1209 03:12:06.432804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:06 crc kubenswrapper[4766]: I1209 03:12:06.432870 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:06 crc kubenswrapper[4766]: I1209 03:12:06.432895 4766 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:07 crc kubenswrapper[4766]: I1209 03:12:07.626457 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 09 03:12:07 crc kubenswrapper[4766]: I1209 03:12:07.626659 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:07 crc kubenswrapper[4766]: I1209 03:12:07.628069 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:07 crc kubenswrapper[4766]: I1209 03:12:07.628100 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:07 crc kubenswrapper[4766]: I1209 03:12:07.628109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:08 crc kubenswrapper[4766]: I1209 03:12:08.352968 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:12:08 crc kubenswrapper[4766]: I1209 03:12:08.353178 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:08 crc kubenswrapper[4766]: I1209 03:12:08.354867 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:08 crc kubenswrapper[4766]: I1209 03:12:08.354905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:08 crc kubenswrapper[4766]: I1209 03:12:08.354916 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:08 crc kubenswrapper[4766]: E1209 03:12:08.919905 4766 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 03:12:10 crc 
kubenswrapper[4766]: I1209 03:12:10.163867 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.163985 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.693973 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.694185 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.695319 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.695380 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.695394 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.699714 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.918372 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:10 
crc kubenswrapper[4766]: I1209 03:12:10.919338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.919397 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:10 crc kubenswrapper[4766]: I1209 03:12:10.919406 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:11 crc kubenswrapper[4766]: I1209 03:12:11.353594 4766 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 03:12:11 crc kubenswrapper[4766]: I1209 03:12:11.353744 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 03:12:11 crc kubenswrapper[4766]: I1209 03:12:11.612256 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 03:12:11 crc kubenswrapper[4766]: I1209 03:12:11.612333 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 03:12:11 crc kubenswrapper[4766]: I1209 03:12:11.617062 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 03:12:11 crc kubenswrapper[4766]: I1209 03:12:11.617175 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.307777 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.308086 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.310375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.310490 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.310569 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.329758 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.785054 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.785348 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.786981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.787029 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.787043 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.795750 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.926954 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.927036 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.928127 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.928164 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.928172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.929033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:13 crc 
kubenswrapper[4766]: I1209 03:12:13.929117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:13 crc kubenswrapper[4766]: I1209 03:12:13.929146 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.613202 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.616355 4766 trace.go:236] Trace[1117475599]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 03:12:01.857) (total time: 14758ms): Dec 09 03:12:16 crc kubenswrapper[4766]: Trace[1117475599]: ---"Objects listed" error: 14758ms (03:12:16.616) Dec 09 03:12:16 crc kubenswrapper[4766]: Trace[1117475599]: [14.758637631s] [14.758637631s] END Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.616405 4766 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.616609 4766 trace.go:236] Trace[295531297]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 03:12:03.415) (total time: 13200ms): Dec 09 03:12:16 crc kubenswrapper[4766]: Trace[295531297]: ---"Objects listed" error: 13200ms (03:12:16.616) Dec 09 03:12:16 crc kubenswrapper[4766]: Trace[295531297]: [13.200765996s] [13.200765996s] END Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.616647 4766 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.616975 4766 trace.go:236] Trace[1191274606]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 
(09-Dec-2025 03:12:01.875) (total time: 14741ms): Dec 09 03:12:16 crc kubenswrapper[4766]: Trace[1191274606]: ---"Objects listed" error: 14741ms (03:12:16.616) Dec 09 03:12:16 crc kubenswrapper[4766]: Trace[1191274606]: [14.741246925s] [14.741246925s] END Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.617014 4766 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.617955 4766 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.619257 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.620674 4766 trace.go:236] Trace[1916670286]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 03:12:02.838) (total time: 13782ms): Dec 09 03:12:16 crc kubenswrapper[4766]: Trace[1916670286]: ---"Objects listed" error: 13782ms (03:12:16.620) Dec 09 03:12:16 crc kubenswrapper[4766]: Trace[1916670286]: [13.78241911s] [13.78241911s] END Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.620692 4766 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.623915 4766 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.777246 4766 apiserver.go:52] "Watching apiserver" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.779997 4766 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.780324 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.780837 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.780965 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.781144 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.781237 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.781144 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.781159 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.780831 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.781806 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.781905 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.782895 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.783252 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.783976 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.784559 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.785544 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.785856 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.786279 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.786377 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.786740 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.811915 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.819290 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.819651 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.819802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.820315 4766 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.825553 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.828153 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.837691 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.841674 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.854735 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.868356 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.870455 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37384->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.870524 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37384->192.168.126.11:17697: read: connection reset by peer" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.870833 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.870877 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.879469 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.881012 4766 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.890813 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.902499 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.920907 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.920964 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.920990 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921016 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921042 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921064 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921083 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921106 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921130 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921154 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921182 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921204 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921248 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 03:12:16 crc 
kubenswrapper[4766]: I1209 03:12:16.921274 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921298 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921326 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921350 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921379 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921403 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921426 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921453 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921479 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921502 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921528 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921652 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921683 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921720 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921745 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921765 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921794 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921839 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921863 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921888 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921910 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921936 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921956 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921979 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922004 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922029 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922053 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922079 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922100 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922124 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922148 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922169 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922189 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922234 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922259 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922285 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922306 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922331 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922355 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922377 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922399 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922424 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922449 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922474 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 03:12:16 crc 
kubenswrapper[4766]: I1209 03:12:16.922499 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922525 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922549 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922572 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922600 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922667 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922695 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922717 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922739 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922763 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922789 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 
03:12:16.922815 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922839 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922867 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922892 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922917 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922939 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922963 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922987 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923033 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923057 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923097 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 03:12:16 crc 
kubenswrapper[4766]: I1209 03:12:16.923120 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923142 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923166 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923192 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923235 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923261 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923284 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923311 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923335 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923358 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923380 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923403 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923428 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923450 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923477 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923501 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921293 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921343 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921467 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921566 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921629 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.921689 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922453 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922461 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922470 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922643 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922656 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922749 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922829 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.922892 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923015 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923283 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923405 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923465 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923457 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.923527 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:12:17.423503081 +0000 UTC m=+19.132808507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925621 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925681 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925709 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925711 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925742 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925773 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925809 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925865 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925895 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925927 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925966 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926004 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926038 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926063 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926087 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926115 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926144 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926170 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926197 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926256 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926283 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926310 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926338 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926362 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926390 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926416 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 
03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926442 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926466 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926495 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926521 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926545 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926570 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926595 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926619 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926643 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926668 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926714 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 
03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926737 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926765 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926795 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926826 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926852 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926875 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926907 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926931 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926955 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926985 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927024 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 
03:12:16.927051 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927078 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927103 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927131 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927155 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927178 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") 
pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927255 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927282 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927306 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927331 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927359 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927385 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927411 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927435 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927461 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927488 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927518 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927572 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927600 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927626 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927653 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927678 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927705 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927737 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927764 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927789 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927816 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927846 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927873 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927897 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927921 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927948 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927975 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928000 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928027 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928054 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928084 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928108 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 03:12:16 crc kubenswrapper[4766]: 
I1209 03:12:16.928137 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928166 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928190 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928239 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928265 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928306 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928332 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928360 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928408 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928438 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928464 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 
03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928491 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928516 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928544 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928603 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928648 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928707 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928740 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928786 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928935 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928980 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929006 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929035 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929064 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929092 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929117 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929201 4766 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929238 4766 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929252 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929268 4766 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929284 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929298 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929311 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929325 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929339 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929352 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929367 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929381 4766 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929394 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929419 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc 
kubenswrapper[4766]: I1209 03:12:16.929434 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929448 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929462 4766 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929475 4766 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929492 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929508 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925968 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925967 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924035 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924073 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924137 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924284 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924312 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924456 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924527 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924839 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924854 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924860 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.924892 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932894 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925072 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925200 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925192 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925393 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.933085 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.925550 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926240 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926265 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926459 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926472 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926491 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926662 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926680 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926737 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926874 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.926976 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927045 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927229 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927436 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927653 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.927995 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928012 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928067 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928233 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928269 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928285 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928292 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.923644 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.928698 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929110 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929140 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929184 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929269 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929491 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.929540 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.929589 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.933569 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:17.433541934 +0000 UTC m=+19.142847360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.930611 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.931312 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.931332 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.931601 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.931963 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.933610 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932003 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932005 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932257 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932117 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932510 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932643 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932666 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.932724 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.933869 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.934109 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.934407 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.934758 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.934891 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.935067 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.935085 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.934904 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.935291 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.935296 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.935592 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.935802 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.936093 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.936140 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.936574 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.936595 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.936889 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.936957 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.937241 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.937269 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.937509 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.937897 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938064 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938063 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938172 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938181 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938276 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938127 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938435 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938553 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.938649 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.939168 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.939080 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.939293 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.939480 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.939439 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.940064 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.940239 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.940124 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.940626 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.940652 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.940923 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.941002 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.941013 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.941158 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.941236 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.941681 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.941773 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.941947 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.942074 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.942304 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.942423 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.942572 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.942636 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:17.442618669 +0000 UTC m=+19.151924095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.942668 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.942900 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.943186 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.943307 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.943477 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.943499 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.943527 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.943652 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.943853 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.943922 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.944327 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.944370 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.944768 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.944521 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.944829 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.944901 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.944780 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.945285 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.945286 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.945495 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.945706 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.946017 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.946053 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.946319 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.946743 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.946854 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.946923 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.948264 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.948376 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.949504 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61" exitCode=255 Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.949540 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61"} Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.950379 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.950715 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.951297 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.951555 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.951665 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.951920 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.952510 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.952814 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.952854 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.952868 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.952931 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:17.452917839 +0000 UTC m=+19.162223265 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.953611 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.953715 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.953736 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.953750 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:16 crc kubenswrapper[4766]: E1209 03:12:16.953806 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-09 03:12:17.453786482 +0000 UTC m=+19.163091908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.954124 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.954131 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.954154 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.954292 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.954943 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.955331 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.955489 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.955739 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.955786 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.956700 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.957364 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.957696 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.957750 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.958189 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.958188 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.958309 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.958435 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.958527 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.959808 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.965416 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.965711 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.966035 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.966357 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.965762 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.967535 4766 scope.go:117] "RemoveContainer" containerID="36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.971945 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.972021 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.973000 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.981079 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.982986 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.988688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:16 crc kubenswrapper[4766]: I1209 03:12:16.997910 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.005533 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.008028 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.012569 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.020171 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030389 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030404 4766 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030415 4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030426 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030435 4766 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030480 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030525 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030565 4766 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030585 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030599 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030611 4766 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030623 4766 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030634 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030648 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030661 4766 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030673 4766 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030685 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" 
DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030696 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030706 4766 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030717 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030729 4766 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030739 4766 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030750 4766 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030761 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 
03:12:17.030772 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030784 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030794 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030805 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030815 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030825 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030836 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030847 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030858 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030869 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030879 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030890 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030901 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030915 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030928 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath 
\"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030942 4766 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030953 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030965 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030975 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030985 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.030996 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031006 4766 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031017 4766 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031028 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031042 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031070 4766 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031084 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031097 4766 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031137 4766 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031153 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031167 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031181 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031193 4766 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031205 4766 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031240 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031254 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031267 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031279 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031294 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031307 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031322 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031335 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031350 4766 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031365 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: 
I1209 03:12:17.031377 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031389 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031401 4766 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031412 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031422 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031433 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031474 4766 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031626 4766 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031723 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031733 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031744 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031755 4766 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031765 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031776 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031786 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031796 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031806 4766 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031816 4766 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031826 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031836 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031846 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031857 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node 
\"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031867 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031877 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031886 4766 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031896 4766 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031905 4766 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031916 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031925 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031935 4766 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031946 4766 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031958 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031969 4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031978 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.031989 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032000 4766 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032011 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032021 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032032 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032042 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032052 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032064 4766 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032074 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032084 4766 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node 
\"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032096 4766 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032107 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032117 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032127 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032137 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032147 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032161 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032176 4766 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032189 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032200 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032225 4766 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032236 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032247 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032257 4766 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032267 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" 
(UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032279 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032289 4766 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032299 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032342 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032352 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032364 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032769 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node 
\"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032783 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032794 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032804 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032816 4766 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032827 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032841 4766 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032852 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc 
kubenswrapper[4766]: I1209 03:12:17.032864 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032875 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032886 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032899 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032909 4766 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032921 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032931 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032942 4766 reconciler_common.go:293] "Volume 
detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032953 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032963 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032974 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032985 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032995 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033004 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033015 4766 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033025 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033035 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033045 4766 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033056 4766 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033066 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033076 4766 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033086 4766 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 
03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033097 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033107 4766 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033117 4766 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033129 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033141 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033154 4766 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.032502 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033166 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033179 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033189 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033201 4766 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033225 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.033236 4766 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.095119 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.101018 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.108147 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 03:12:17 crc kubenswrapper[4766]: W1209 03:12:17.127662 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-cefec998e98161a75d03d72b4bdab3f27c02cc98512fe3381b75325bc372c89d WatchSource:0}: Error finding container cefec998e98161a75d03d72b4bdab3f27c02cc98512fe3381b75325bc372c89d: Status 404 returned error can't find the container with id cefec998e98161a75d03d72b4bdab3f27c02cc98512fe3381b75325bc372c89d Dec 09 03:12:17 crc kubenswrapper[4766]: W1209 03:12:17.129927 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-32c28a56113e38738f1c2986b4a20d29bd12ee21315d8cc7ed975cf4c149c4c1 WatchSource:0}: Error finding container 32c28a56113e38738f1c2986b4a20d29bd12ee21315d8cc7ed975cf4c149c4c1: Status 404 returned error can't find the container with id 32c28a56113e38738f1c2986b4a20d29bd12ee21315d8cc7ed975cf4c149c4c1 Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.437940 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.438059 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 
03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.439103 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.439185 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:18.439172053 +0000 UTC m=+20.148477479 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.440000 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:12:18.439992735 +0000 UTC m=+20.149298161 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.538932 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.538977 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.538996 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539092 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" 
not registered Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539136 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:18.53912488 +0000 UTC m=+20.248430306 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539509 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539521 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539531 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539554 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-09 03:12:18.539547421 +0000 UTC m=+20.248852847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539591 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539598 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539605 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:17 crc kubenswrapper[4766]: E1209 03:12:17.539624 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:18.539618883 +0000 UTC m=+20.248924309 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.954909 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e"} Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.954975 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cefec998e98161a75d03d72b4bdab3f27c02cc98512fe3381b75325bc372c89d"} Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.955798 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"32c28a56113e38738f1c2986b4a20d29bd12ee21315d8cc7ed975cf4c149c4c1"} Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.958390 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc"} Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.958458 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be"} Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.958475 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b657e2b4717c719f0955650b4680fbbeecf704adec52fe8bc91eae2f34f88450"} Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.960974 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.962702 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e"} Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.963353 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.971467 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:17Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:17 crc kubenswrapper[4766]: I1209 03:12:17.985097 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting 
controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d
4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:17Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.007169 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:17Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.019974 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.037149 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.048067 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.060639 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.075244 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.088794 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.104281 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.117587 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.133078 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.150456 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.169739 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.357702 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.362639 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.368005 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.373453 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.386104 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.401827 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.418906 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.432408 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.445486 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.446713 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.446806 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.446907 4766 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:12:20.446886303 +0000 UTC m=+22.156191729 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.446912 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.446961 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:20.446954785 +0000 UTC m=+22.156260211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.456410 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.471635 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.484034 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.500790 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.513903 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.524417 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.539436 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 
03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.547863 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.547901 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.547930 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548030 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548051 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548060 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548067 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548079 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548095 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548174 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548105 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:20.548093065 +0000 UTC m=+22.257398491 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548239 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:20.548205668 +0000 UTC m=+22.257511094 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.548250 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-09 03:12:20.548245089 +0000 UTC m=+22.257550515 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.557570 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed2
1\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.575422 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.838435 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.838633 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.838732 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.838657 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.838907 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:18 crc kubenswrapper[4766]: E1209 03:12:18.839017 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.845680 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.847031 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.849828 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.850926 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.851506 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.851984 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.852690 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.853455 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.854286 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.854323 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.854945 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.855547 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.856305 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.856895 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.857489 
4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.858033 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.858625 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.859191 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.861159 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.862982 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.865862 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.867127 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.868734 
4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.871310 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.872951 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.875424 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.877383 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.879326 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 
03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.880313 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.880840 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.882134 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.883584 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.884866 4766 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.885104 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.890513 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.892329 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.894344 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.896623 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.897405 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.898475 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.899146 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.900317 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.900799 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.900923 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.901899 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.902545 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.903567 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.904695 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.905634 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.906231 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.907381 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.907856 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.908764 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.909261 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.909796 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.910790 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.911332 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.914233 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.926601 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.945631 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.962055 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:18 crc kubenswrapper[4766]: I1209 03:12:18.979718 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.820283 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.822919 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.822962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.822972 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.823041 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.832556 4766 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.832943 4766 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.834399 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.834449 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.834467 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.834493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.834513 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:19Z","lastTransitionTime":"2025-12-09T03:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:19 crc kubenswrapper[4766]: E1209 03:12:19.866367 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.872710 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.872753 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.872762 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.872778 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.872789 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:19Z","lastTransitionTime":"2025-12-09T03:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:19 crc kubenswrapper[4766]: E1209 03:12:19.887980 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.893455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.893504 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.893516 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.893534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.893546 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:19Z","lastTransitionTime":"2025-12-09T03:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:19 crc kubenswrapper[4766]: E1209 03:12:19.908261 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.913061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.913128 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.913139 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.913161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.913176 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:19Z","lastTransitionTime":"2025-12-09T03:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:19 crc kubenswrapper[4766]: E1209 03:12:19.929087 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.934341 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.934390 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.934404 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.934425 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.934440 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:19Z","lastTransitionTime":"2025-12-09T03:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:19 crc kubenswrapper[4766]: E1209 03:12:19.949613 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:19 crc kubenswrapper[4766]: E1209 03:12:19.950000 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.952662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.952697 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.952706 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.952721 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.952733 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:19Z","lastTransitionTime":"2025-12-09T03:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.969767 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf"} Dec 09 03:12:19 crc kubenswrapper[4766]: I1209 03:12:19.991667 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.012781 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:20Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.032961 4766 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:20Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.048274 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:20Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.055408 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.055458 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.055478 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.055504 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.055520 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.062491 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:20Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.079431 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:20Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.091615 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:20Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.106103 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:20Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.158796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.158859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.158877 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.158905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.158923 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.263023 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.263121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.263139 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.263163 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.263182 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.366637 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.366697 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.366716 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.366745 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.366766 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.465450 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.465590 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.465728 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.465823 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:24.46579659 +0000 UTC m=+26.175102046 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.465926 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:12:24.465857111 +0000 UTC m=+26.175162577 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.469927 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.469980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.469998 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.470024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.470040 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.567132 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.567882 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.568156 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.567428 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.568690 
4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.568871 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.569131 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:24.569100549 +0000 UTC m=+26.278406005 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.567997 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.568256 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.569633 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:24.569603192 +0000 UTC m=+26.278908658 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.569659 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.569909 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.570088 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:24.570068714 +0000 UTC m=+26.279374180 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.573169 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.573269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.573290 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.573320 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.573341 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.677365 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.677437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.677456 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.677482 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.677499 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.781131 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.781202 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.781261 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.781296 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.781323 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.838463 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.838569 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.838602 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.838756 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.838997 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:20 crc kubenswrapper[4766]: E1209 03:12:20.839178 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.884515 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.884589 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.884599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.884616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.884628 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.986676 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.986750 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.986765 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.986789 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:20 crc kubenswrapper[4766]: I1209 03:12:20.986805 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:20Z","lastTransitionTime":"2025-12-09T03:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.089522 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.089613 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.089628 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.089659 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.089674 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.193672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.193762 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.193791 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.193829 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.193863 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.296874 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.296933 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.296949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.296974 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.296989 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.399782 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.399851 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.399865 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.399886 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.399902 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.502437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.502491 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.502504 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.502525 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.502541 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.605964 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.606088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.606116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.606152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.606179 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.710322 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.710388 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.710402 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.710428 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.710455 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.813121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.813201 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.813252 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.813284 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.813315 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.917167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.917285 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.917315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.917355 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:21 crc kubenswrapper[4766]: I1209 03:12:21.917383 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:21Z","lastTransitionTime":"2025-12-09T03:12:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.019479 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.019521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.019530 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.019544 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.019554 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.123719 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.123783 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.123795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.123816 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.123830 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.226149 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.226197 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.226230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.226251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.226264 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.330396 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.330486 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.330510 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.330546 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.330569 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.432599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.432640 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.432650 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.432665 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.432677 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.534876 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.534917 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.534926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.534939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.534949 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.637018 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.637054 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.637062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.637075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.637085 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.739163 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.739250 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.739268 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.739290 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.739307 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.838160 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.838257 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.838194 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:22 crc kubenswrapper[4766]: E1209 03:12:22.838425 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:22 crc kubenswrapper[4766]: E1209 03:12:22.838552 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:22 crc kubenswrapper[4766]: E1209 03:12:22.838636 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.842811 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.842842 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.842853 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.842870 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.842881 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.946101 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.946144 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.946153 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.946169 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:22 crc kubenswrapper[4766]: I1209 03:12:22.946180 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:22Z","lastTransitionTime":"2025-12-09T03:12:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.048475 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.048515 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.048527 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.048542 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.048553 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.150727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.150769 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.150781 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.150798 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.150810 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.253206 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.253265 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.253276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.253294 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.253306 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.264004 4766 csr.go:261] certificate signing request csr-749nr is approved, waiting to be issued Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.324420 4766 csr.go:257] certificate signing request csr-749nr is issued Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.355314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.355355 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.355364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.355376 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.355386 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.457894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.457923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.457932 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.457945 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.457953 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.560550 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.560584 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.560594 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.560610 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.560621 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.662585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.662612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.662622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.662635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.662644 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.765367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.765398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.765406 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.765419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.765428 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.867853 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.867895 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.867908 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.867925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.867937 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.970732 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.970789 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.970799 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.970813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:23 crc kubenswrapper[4766]: I1209 03:12:23.970822 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:23Z","lastTransitionTime":"2025-12-09T03:12:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.073120 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.073158 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.073188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.073208 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.073234 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.074121 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7rlr6"] Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.074460 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7rlr6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.081175 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.081261 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-db9hx"] Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.081335 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.081526 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.081647 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xm6zk"] Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.081754 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.082167 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gx9l2"] Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.082341 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.082395 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.086874 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.088228 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.088324 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.089191 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.089634 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.089644 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.090101 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.092142 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.092165 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.097536 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.097707 4766 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.097727 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.137137 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.162273 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.175329 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.175362 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.175374 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.175389 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.175401 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.176423 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b634076
65158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.192058 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.207329 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.224615 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232397 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-os-release\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232473 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-daemon-config\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232532 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9s2s\" (UniqueName: \"kubernetes.io/projected/bfda2870-98c2-41d7-82f4-45e9b5b18460-kube-api-access-g9s2s\") pod \"node-resolver-7rlr6\" (UID: \"bfda2870-98c2-41d7-82f4-45e9b5b18460\") " pod="openshift-dns/node-resolver-7rlr6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232579 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-cni-dir\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232628 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-k8s-cni-cncf-io\") pod \"multus-gx9l2\" (UID: 
\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232666 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-cni-multus\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232704 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-cni-bin\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232736 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-hostroot\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232789 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rzz4\" (UniqueName: \"kubernetes.io/projected/99b9b55d-a081-4c84-8535-58468c316659-kube-api-access-9rzz4\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232877 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-system-cni-dir\") pod \"multus-gx9l2\" (UID: 
\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232959 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-cnibin\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.232985 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfda2870-98c2-41d7-82f4-45e9b5b18460-hosts-file\") pod \"node-resolver-7rlr6\" (UID: \"bfda2870-98c2-41d7-82f4-45e9b5b18460\") " pod="openshift-dns/node-resolver-7rlr6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233004 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfzcd\" (UniqueName: \"kubernetes.io/projected/c83a9d31-9c87-4a13-ab9a-2992e852eb47-kube-api-access-jfzcd\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233026 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-os-release\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233063 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99b9b55d-a081-4c84-8535-58468c316659-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233083 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-socket-dir-parent\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233103 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-netns\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233127 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a42b369b-e4ad-447c-b9b1-5c2461116838-proxy-tls\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233147 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a42b369b-e4ad-447c-b9b1-5c2461116838-mcd-auth-proxy-config\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-etc-kubernetes\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233182 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233199 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a42b369b-e4ad-447c-b9b1-5c2461116838-rootfs\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233237 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-multus-certs\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233268 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c83a9d31-9c87-4a13-ab9a-2992e852eb47-cni-binary-copy\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233304 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-conf-dir\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233324 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99b9b55d-a081-4c84-8535-58468c316659-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233344 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2rzs\" (UniqueName: \"kubernetes.io/projected/a42b369b-e4ad-447c-b9b1-5c2461116838-kube-api-access-d2rzs\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233364 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-cnibin\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233384 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-kubelet\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.233439 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-system-cni-dir\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.239558 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.252858 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.268938 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.277885 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.277920 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.277929 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.277945 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.277955 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.286480 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.300499 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.310973 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.321657 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.326205 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-09 03:07:23 +0000 UTC, rotation deadline is 2026-08-23 11:09:53.413766809 +0000 UTC Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.326270 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6175h57m29.087498831s for next certificate rotation Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.333690 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.333867 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-k8s-cni-cncf-io\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.333894 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-cni-multus\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.333916 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-cni-bin\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.333934 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-hostroot\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.333961 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rzz4\" (UniqueName: \"kubernetes.io/projected/99b9b55d-a081-4c84-8535-58468c316659-kube-api-access-9rzz4\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.333981 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-system-cni-dir\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334005 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-cnibin\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334022 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfda2870-98c2-41d7-82f4-45e9b5b18460-hosts-file\") pod \"node-resolver-7rlr6\" (UID: \"bfda2870-98c2-41d7-82f4-45e9b5b18460\") " pod="openshift-dns/node-resolver-7rlr6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334042 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfzcd\" (UniqueName: 
\"kubernetes.io/projected/c83a9d31-9c87-4a13-ab9a-2992e852eb47-kube-api-access-jfzcd\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334051 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-cni-bin\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334050 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-k8s-cni-cncf-io\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334105 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfda2870-98c2-41d7-82f4-45e9b5b18460-hosts-file\") pod \"node-resolver-7rlr6\" (UID: \"bfda2870-98c2-41d7-82f4-45e9b5b18460\") " pod="openshift-dns/node-resolver-7rlr6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334122 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-os-release\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334133 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-system-cni-dir\") pod \"multus-gx9l2\" (UID: 
\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334096 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-cnibin\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334058 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-os-release\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334055 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-cni-multus\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334084 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-hostroot\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334204 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99b9b55d-a081-4c84-8535-58468c316659-cni-binary-copy\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" 
Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334250 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-socket-dir-parent\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334273 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-netns\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334308 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a42b369b-e4ad-447c-b9b1-5c2461116838-proxy-tls\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334328 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a42b369b-e4ad-447c-b9b1-5c2461116838-mcd-auth-proxy-config\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334331 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-socket-dir-parent\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 
03:12:24.334346 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-etc-kubernetes\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334366 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334388 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a42b369b-e4ad-447c-b9b1-5c2461116838-rootfs\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334413 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-multus-certs\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334441 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c83a9d31-9c87-4a13-ab9a-2992e852eb47-cni-binary-copy\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334506 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-conf-dir\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334531 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99b9b55d-a081-4c84-8535-58468c316659-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334556 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2rzs\" (UniqueName: \"kubernetes.io/projected/a42b369b-e4ad-447c-b9b1-5c2461116838-kube-api-access-d2rzs\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334581 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-cnibin\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334600 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-kubelet\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334623 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-system-cni-dir\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334646 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-cni-dir\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334664 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-os-release\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334684 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-daemon-config\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334710 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9s2s\" (UniqueName: \"kubernetes.io/projected/bfda2870-98c2-41d7-82f4-45e9b5b18460-kube-api-access-g9s2s\") pod \"node-resolver-7rlr6\" (UID: \"bfda2870-98c2-41d7-82f4-45e9b5b18460\") " pod="openshift-dns/node-resolver-7rlr6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334932 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a42b369b-e4ad-447c-b9b1-5c2461116838-rootfs\") pod \"machine-config-daemon-db9hx\" (UID: 
\"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.334966 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-multus-certs\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335016 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335097 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-etc-kubernetes\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335172 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-run-netns\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335257 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99b9b55d-a081-4c84-8535-58468c316659-cni-binary-copy\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " 
pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335264 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-host-var-lib-kubelet\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335324 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-conf-dir\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335349 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b9b55d-a081-4c84-8535-58468c316659-system-cni-dir\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335402 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-cni-dir\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335426 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a42b369b-e4ad-447c-b9b1-5c2461116838-mcd-auth-proxy-config\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc 
kubenswrapper[4766]: I1209 03:12:24.335427 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-os-release\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335468 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c83a9d31-9c87-4a13-ab9a-2992e852eb47-cnibin\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335688 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c83a9d31-9c87-4a13-ab9a-2992e852eb47-cni-binary-copy\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.335957 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c83a9d31-9c87-4a13-ab9a-2992e852eb47-multus-daemon-config\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.336474 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99b9b55d-a081-4c84-8535-58468c316659-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.341771 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a42b369b-e4ad-447c-b9b1-5c2461116838-proxy-tls\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.350667 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.351603 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rzz4\" (UniqueName: \"kubernetes.io/projected/99b9b55d-a081-4c84-8535-58468c316659-kube-api-access-9rzz4\") pod \"multus-additional-cni-plugins-xm6zk\" (UID: \"99b9b55d-a081-4c84-8535-58468c316659\") " pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.355796 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfzcd\" (UniqueName: \"kubernetes.io/projected/c83a9d31-9c87-4a13-ab9a-2992e852eb47-kube-api-access-jfzcd\") pod \"multus-gx9l2\" (UID: \"c83a9d31-9c87-4a13-ab9a-2992e852eb47\") " pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.356366 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2rzs\" (UniqueName: \"kubernetes.io/projected/a42b369b-e4ad-447c-b9b1-5c2461116838-kube-api-access-d2rzs\") pod \"machine-config-daemon-db9hx\" (UID: \"a42b369b-e4ad-447c-b9b1-5c2461116838\") " pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.359533 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9s2s\" (UniqueName: \"kubernetes.io/projected/bfda2870-98c2-41d7-82f4-45e9b5b18460-kube-api-access-g9s2s\") pod \"node-resolver-7rlr6\" (UID: \"bfda2870-98c2-41d7-82f4-45e9b5b18460\") " pod="openshift-dns/node-resolver-7rlr6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.366330 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.380284 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.380325 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.380336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.380352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.380362 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.387449 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7rlr6" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.392405 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.394425 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.401544 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.407688 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gx9l2" Dec 09 03:12:24 crc kubenswrapper[4766]: W1209 03:12:24.410289 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfda2870_98c2_41d7_82f4_45e9b5b18460.slice/crio-c9821329d3f371a5fcdbfd3838eb5b7ff704171d602c43853e53404448b6e4c6 WatchSource:0}: Error finding container c9821329d3f371a5fcdbfd3838eb5b7ff704171d602c43853e53404448b6e4c6: Status 404 returned error can't find the container with id c9821329d3f371a5fcdbfd3838eb5b7ff704171d602c43853e53404448b6e4c6 Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.415249 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: W1209 03:12:24.421927 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b9b55d_a081_4c84_8535_58468c316659.slice/crio-434a6322e50712fca8af5da7ac0fc1bbbbadfd153b1e7616b5a3513f99c61c9f WatchSource:0}: Error finding container 434a6322e50712fca8af5da7ac0fc1bbbbadfd153b1e7616b5a3513f99c61c9f: Status 404 returned error can't find the container with id 434a6322e50712fca8af5da7ac0fc1bbbbadfd153b1e7616b5a3513f99c61c9f Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.446565 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.472657 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.484465 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.484498 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.484507 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.484521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.484531 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.496517 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.524129 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-62t52"] Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.524996 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.528941 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.529238 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.529412 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.529576 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.529743 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.529889 
4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.535268 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.536401 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.536538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.536628 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.536685 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:32.536666448 +0000 UTC m=+34.245971874 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.536768 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:12:32.536761061 +0000 UTC m=+34.246066487 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.552961 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.568600 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.581804 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.587999 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.588044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.588060 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.588086 4766 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.588104 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.593181 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.609100 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.624272 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638098 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-slash\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638445 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-log-socket\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638477 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-ovn\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638503 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-config\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638553 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638577 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-kubelet\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638602 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-script-lib\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: 
I1209 03:12:24.638650 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tw62\" (UniqueName: \"kubernetes.io/projected/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-kube-api-access-8tw62\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638679 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-systemd\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638704 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638730 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovn-node-metrics-cert\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638756 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-var-lib-openvswitch\") pod \"ovnkube-node-62t52\" (UID: 
\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638783 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-etc-openvswitch\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-bin\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.638969 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.639171 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639262 4766 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.639286 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-systemd-units\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639298 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.639312 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-netd\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639316 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639371 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639407 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.639408 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-openvswitch\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639429 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.639482 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639506 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:32.639478914 +0000 UTC m=+34.348784540 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639533 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:32.639524345 +0000 UTC m=+34.348830011 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.639560 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-netns\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639594 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.639627 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-env-overrides\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.639641 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:32.639626458 +0000 UTC m=+34.348931874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.639666 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-node-log\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.658027 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.670885 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.688220 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.690313 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.690344 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.690353 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.690370 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.690379 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.705503 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.723123 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740752 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-env-overrides\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740807 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-node-log\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740831 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-slash\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740853 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-log-socket\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740879 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-ovn\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740905 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-config\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740928 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-kubelet\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740942 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-node-log\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741023 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-62t52\" 
(UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.740965 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741089 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-slash\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741088 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-systemd\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741113 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-systemd\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741159 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-script-lib\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tw62\" (UniqueName: \"kubernetes.io/projected/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-kube-api-access-8tw62\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741291 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-etc-openvswitch\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741316 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-bin\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovn-node-metrics-cert\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741399 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-var-lib-openvswitch\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741470 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-systemd-units\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741493 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-netd\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741516 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-openvswitch\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741570 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-netns\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc 
kubenswrapper[4766]: I1209 03:12:24.741653 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-netns\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741694 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-log-socket\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741700 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-env-overrides\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741749 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-kubelet\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741059 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-ovn\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741869 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-config\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741893 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-etc-openvswitch\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741929 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-netd\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741957 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.741982 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-openvswitch\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.742025 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-systemd-units\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.742099 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-var-lib-openvswitch\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.742025 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-bin\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.742444 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-script-lib\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.746899 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:24Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.748241 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovn-node-metrics-cert\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.763256 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tw62\" (UniqueName: \"kubernetes.io/projected/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-kube-api-access-8tw62\") pod \"ovnkube-node-62t52\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.793504 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.793546 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.793555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.793570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.793580 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.839156 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.839204 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.839302 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.839162 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.839452 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:24 crc kubenswrapper[4766]: E1209 03:12:24.839560 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.862439 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.896543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.896591 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.896602 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.896618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.896627 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:24Z","lastTransitionTime":"2025-12-09T03:12:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:24 crc kubenswrapper[4766]: W1209 03:12:24.904251 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2c7a4fd_d2ea_4d01_85c3_beb2be74d9ec.slice/crio-e1e35f022910b1a7046ab8cdbfdb4cd4a191657cfb40af89d3014ce856160bd4 WatchSource:0}: Error finding container e1e35f022910b1a7046ab8cdbfdb4cd4a191657cfb40af89d3014ce856160bd4: Status 404 returned error can't find the container with id e1e35f022910b1a7046ab8cdbfdb4cd4a191657cfb40af89d3014ce856160bd4 Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.987201 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.987302 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.987323 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"01f91c82d8236f0857df8bde4803ca637dd49305229bb73d304ef0536a5cf19a"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.988404 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"e1e35f022910b1a7046ab8cdbfdb4cd4a191657cfb40af89d3014ce856160bd4"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.989844 4766 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gx9l2" event={"ID":"c83a9d31-9c87-4a13-ab9a-2992e852eb47","Type":"ContainerStarted","Data":"d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.989887 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gx9l2" event={"ID":"c83a9d31-9c87-4a13-ab9a-2992e852eb47","Type":"ContainerStarted","Data":"d1cf93e61162fb5a4dd62a18a2fd4e098f740aebd32c9066f1db7c635f786470"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.991028 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7rlr6" event={"ID":"bfda2870-98c2-41d7-82f4-45e9b5b18460","Type":"ContainerStarted","Data":"549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.991052 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7rlr6" event={"ID":"bfda2870-98c2-41d7-82f4-45e9b5b18460","Type":"ContainerStarted","Data":"c9821329d3f371a5fcdbfd3838eb5b7ff704171d602c43853e53404448b6e4c6"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.993076 4766 generic.go:334] "Generic (PLEG): container finished" podID="99b9b55d-a081-4c84-8535-58468c316659" containerID="b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433" exitCode=0 Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.993130 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" event={"ID":"99b9b55d-a081-4c84-8535-58468c316659","Type":"ContainerDied","Data":"b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433"} Dec 09 03:12:24 crc kubenswrapper[4766]: I1209 03:12:24.993189 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" 
event={"ID":"99b9b55d-a081-4c84-8535-58468c316659","Type":"ContainerStarted","Data":"434a6322e50712fca8af5da7ac0fc1bbbbadfd153b1e7616b5a3513f99c61c9f"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.006192 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.006248 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.006272 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.006291 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.006305 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.013342 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.033289 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.050243 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" 
Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.065779 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.077925 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.095687 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.109228 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.109264 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.109274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.109292 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.109304 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.110036 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.126289 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.137994 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.148992 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.163237 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.181554 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.193628 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.204837 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.212660 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.212702 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.212710 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.212725 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.212738 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.219353 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.230479 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.241962 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.258874 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.277527 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.290488 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.303719 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.315544 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.315604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.315629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.315654 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.315670 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.316718 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.327275 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.338160 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.352438 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.366598 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:25Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.419008 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.419044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.419054 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 
03:12:25.419068 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.419078 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.521534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.521574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.521589 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.521608 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.521622 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.624064 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.624400 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.624416 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.624433 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.624445 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.727897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.728170 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.728409 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.728536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.728631 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.830695 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.830752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.830763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.830795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.830811 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.934279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.934337 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.934348 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.934373 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.934392 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:25Z","lastTransitionTime":"2025-12-09T03:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.998907 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0" exitCode=0 Dec 09 03:12:25 crc kubenswrapper[4766]: I1209 03:12:25.998934 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.002808 4766 generic.go:334] "Generic (PLEG): container finished" podID="99b9b55d-a081-4c84-8535-58468c316659" containerID="0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde" exitCode=0 Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.002838 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" event={"ID":"99b9b55d-a081-4c84-8535-58468c316659","Type":"ContainerDied","Data":"0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.021738 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.036540 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.036792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.036811 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.036825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.036840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.036851 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.053272 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.068107 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.091745 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.107788 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 
03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3
5825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.121022 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.134267 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.141024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.141062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.141071 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.141087 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.141099 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.147880 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.163774 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.178333 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.194468 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.211825 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.227948 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.244118 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.244343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.244374 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.244384 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 
03:12:26.244401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.244411 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.255862 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.269452 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.286814 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.303484 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.321858 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.345573 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.347145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.347171 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.347181 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.347197 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.347225 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.362147 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.375181 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.390706 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.404552 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.417043 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:26Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.448997 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.449035 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.449047 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.449063 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.449075 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.551032 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.551073 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.551081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.551098 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.551108 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.653672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.653707 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.653717 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.653730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.653740 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.756550 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.756591 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.756600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.756623 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.756633 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.838958 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.839012 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:26 crc kubenswrapper[4766]: E1209 03:12:26.839119 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.839309 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:26 crc kubenswrapper[4766]: E1209 03:12:26.839450 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:26 crc kubenswrapper[4766]: E1209 03:12:26.839579 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.861364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.861416 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.861429 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.861455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.861472 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.965162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.965255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.965273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.965300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:26 crc kubenswrapper[4766]: I1209 03:12:26.965322 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:26Z","lastTransitionTime":"2025-12-09T03:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.008651 4766 generic.go:334] "Generic (PLEG): container finished" podID="99b9b55d-a081-4c84-8535-58468c316659" containerID="36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b" exitCode=0 Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.008706 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" event={"ID":"99b9b55d-a081-4c84-8535-58468c316659","Type":"ContainerDied","Data":"36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.013085 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.013631 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.013648 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.013662 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.013673 4766 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.013684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.028792 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.046715 4766 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.061329 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.067912 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.067961 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.067975 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.067998 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.068011 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.077993 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 
03:12:27.108371 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.121869 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.134806 4766 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.148656 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.164142 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.173059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.173101 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.173117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.173141 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.173157 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.178093 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z 
is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.196608 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.208926 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.225522 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:27Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.275832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.275888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.275910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.275932 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.275943 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.378761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.378797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.378807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.378821 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.378832 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.481719 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.481751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.481759 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.481773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.481784 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.584775 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.584838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.584858 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.585128 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.585162 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.687427 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.687457 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.687466 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.687482 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.687490 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.789801 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.789848 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.789857 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.789871 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.789881 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.892032 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.892073 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.892084 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.892100 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.892110 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.994680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.994724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.994737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.994755 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:27 crc kubenswrapper[4766]: I1209 03:12:27.994769 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:27Z","lastTransitionTime":"2025-12-09T03:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.021672 4766 generic.go:334] "Generic (PLEG): container finished" podID="99b9b55d-a081-4c84-8535-58468c316659" containerID="9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6" exitCode=0 Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.021743 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" event={"ID":"99b9b55d-a081-4c84-8535-58468c316659","Type":"ContainerDied","Data":"9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.041888 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.061631 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.079689 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.098859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.098925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.098940 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.098960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.098982 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.102382 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.118290 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.135604 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.154957 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.180781 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.200984 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.202827 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.202866 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.202877 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:28 crc 
kubenswrapper[4766]: I1209 03:12:28.202897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.203290 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.209287 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-l7fcf"] Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.209730 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.211881 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.212384 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.212770 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.214766 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.223318 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.275775 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/5d8f5525-c018-45f6-8f73-355ac763742e-serviceca\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.275839 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d8f5525-c018-45f6-8f73-355ac763742e-host\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.275858 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf7jm\" (UniqueName: \"kubernetes.io/projected/5d8f5525-c018-45f6-8f73-355ac763742e-kube-api-access-jf7jm\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.285996 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.302196 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.306846 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.306908 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.306926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.306949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.306969 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.317555 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.340912 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 
03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.367453 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.377465 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5d8f5525-c018-45f6-8f73-355ac763742e-serviceca\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.377528 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5d8f5525-c018-45f6-8f73-355ac763742e-host\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.377551 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf7jm\" (UniqueName: \"kubernetes.io/projected/5d8f5525-c018-45f6-8f73-355ac763742e-kube-api-access-jf7jm\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.377909 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d8f5525-c018-45f6-8f73-355ac763742e-host\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.378886 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5d8f5525-c018-45f6-8f73-355ac763742e-serviceca\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.382674 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.402559 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.402791 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf7jm\" (UniqueName: \"kubernetes.io/projected/5d8f5525-c018-45f6-8f73-355ac763742e-kube-api-access-jf7jm\") pod \"node-ca-l7fcf\" (UID: \"5d8f5525-c018-45f6-8f73-355ac763742e\") " pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.409826 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.409887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.409901 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:28 crc 
kubenswrapper[4766]: I1209 03:12:28.409925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.409936 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.417257 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.433197 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.448510 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.465488 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.478365 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.492683 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.508588 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.513576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.513630 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.513649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.513679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.513701 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.523574 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-l7fcf" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.526393 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.545012 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.562733 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.617247 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.617325 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.617346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.617378 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.617430 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.694444 4766 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 09 03:12:28 crc kubenswrapper[4766]: W1209 03:12:28.696336 4766 reflector.go:484] object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 09 03:12:28 crc kubenswrapper[4766]: W1209 03:12:28.696913 4766 reflector.go:484] object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"image-registry-certificates": Unexpected watch close - watch lasted less than a second and no items received Dec 09 03:12:28 crc kubenswrapper[4766]: W1209 03:12:28.697430 4766 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 09 03:12:28 crc kubenswrapper[4766]: W1209 03:12:28.697530 4766 reflector.go:484] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: very short watch: object-"openshift-image-registry"/"node-ca-dockercfg-4777p": Unexpected watch close - watch lasted less than a second and no items received Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.722732 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.722768 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.722780 4766 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.722796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.722808 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.825770 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.825820 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.825835 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.825955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.825986 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.838669 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.838721 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:28 crc kubenswrapper[4766]: E1209 03:12:28.838806 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.838896 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:28 crc kubenswrapper[4766]: E1209 03:12:28.839057 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:28 crc kubenswrapper[4766]: E1209 03:12:28.839172 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.856291 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\
\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.871401 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.903007 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.916690 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.930479 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.930531 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.930543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 
03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.930564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.930577 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:28Z","lastTransitionTime":"2025-12-09T03:12:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.946712 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f280
8b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.963344 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.981126 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:28 crc kubenswrapper[4766]: I1209 03:12:28.994731 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.007717 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.024377 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.032450 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.032907 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.032938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.032969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.033002 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.035046 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.043792 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.044847 4766 generic.go:334] "Generic (PLEG): container finished" podID="99b9b55d-a081-4c84-8535-58468c316659" containerID="061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd" exitCode=0 Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.044961 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" event={"ID":"99b9b55d-a081-4c84-8535-58468c316659","Type":"ContainerDied","Data":"061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.047067 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l7fcf" event={"ID":"5d8f5525-c018-45f6-8f73-355ac763742e","Type":"ContainerStarted","Data":"774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.047717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l7fcf" 
event={"ID":"5d8f5525-c018-45f6-8f73-355ac763742e","Type":"ContainerStarted","Data":"0ddf7fc703f5d10b4c3d44e3370f4908d3715fc5e83a2a5f56d086b512147032"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.058197 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.075328 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.089706 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.104126 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.121343 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.137057 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.141069 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.141137 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.141151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.141171 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.141184 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.152579 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.167787 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.186584 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.204700 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.225541 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.241194 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.248366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.248459 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.248475 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 
03:12:29.248981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.249038 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.262639 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.278115 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.289833 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.303441 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.316782 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.353535 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.353578 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.353591 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.353613 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.353623 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.456398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.456448 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.456460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.456484 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.456501 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.545385 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.560115 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.560223 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.560242 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.560264 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.560278 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.663648 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.663736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.663764 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.663802 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.663830 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.767409 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.767478 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.767494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.767517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.767531 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.872605 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.872709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.872731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.872770 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.872828 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.897291 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.918275 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.937147 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\"
,\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.938359 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.953891 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.972886 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.976616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.976777 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.976894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.976975 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.977056 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:29Z","lastTransitionTime":"2025-12-09T03:12:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:29 crc kubenswrapper[4766]: I1209 03:12:29.990445 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.009351 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.016552 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.033750 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.046245 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.059621 4766 generic.go:334] "Generic (PLEG): container finished" podID="99b9b55d-a081-4c84-8535-58468c316659" containerID="a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484" exitCode=0 Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.059700 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" 
event={"ID":"99b9b55d-a081-4c84-8535-58468c316659","Type":"ContainerDied","Data":"a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.064146 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5
d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.079658 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.084043 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.084098 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.084116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.084143 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.084162 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.094033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.094108 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.094151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.094257 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.094288 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.096797 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.108007 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.110574 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.112015 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.112048 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.112057 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc 
kubenswrapper[4766]: I1209 03:12:30.112085 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.112101 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.127414 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.127831 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812
165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.138083 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.138302 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.138404 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.138474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.138549 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.146072 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.150358 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.154162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.154290 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.154311 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.154342 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.154362 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.163070 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.168188 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.172841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.172883 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.172897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.172917 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.172931 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.175266 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.186043 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.186160 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.188517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.188674 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.188689 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.188713 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.188728 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.188903 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.202538 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.217296 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b
0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.238934 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.239195 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.251648 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.270601 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.284739 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.291254 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.291289 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.291298 4766 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.291313 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.291343 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.298702 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0
f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.310977 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.324945 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.339361 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:
16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.353823 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:30Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.393989 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.394043 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc 
kubenswrapper[4766]: I1209 03:12:30.394057 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.394080 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.394093 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.497375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.497423 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.497433 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.497454 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.497467 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.601123 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.601237 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.601253 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.601275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.601289 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.704950 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.705003 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.705015 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.705036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.705048 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.808549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.808590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.808598 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.808613 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.808623 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.842400 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.842532 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.843002 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.843064 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.843183 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:30 crc kubenswrapper[4766]: E1209 03:12:30.843280 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.916942 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.917004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.917026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.917054 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:30 crc kubenswrapper[4766]: I1209 03:12:30.917077 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:30Z","lastTransitionTime":"2025-12-09T03:12:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.019494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.019541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.019555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.019575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.019590 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.069404 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" event={"ID":"99b9b55d-a081-4c84-8535-58468c316659","Type":"ContainerStarted","Data":"abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.093657 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529d
eab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature 
gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.109612 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.121691 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.121749 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.121766 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.121789 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.121801 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.133899 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.154258 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.175093 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.190295 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.207778 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.224740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.224784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.224822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.224843 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.224855 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.230067 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.241784 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.256361 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.271170 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.285600 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.300132 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.319467 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:31Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.327782 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.327908 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.327969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.328050 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.328120 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.431510 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.431555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.431566 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.431582 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.431594 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.535189 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.535258 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.535270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.535289 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.535302 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.637000 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.637038 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.637049 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.637065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.637076 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.739639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.739677 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.739691 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.739706 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.739718 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.842094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.842146 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.842157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.842175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.842185 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.945332 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.945431 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.945449 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.945480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:31 crc kubenswrapper[4766]: I1209 03:12:31.945501 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:31Z","lastTransitionTime":"2025-12-09T03:12:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.048703 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.048776 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.048797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.048825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.048847 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.079532 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.080359 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.080574 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.080601 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.105774 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\"
,\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.120760 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.120826 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.122974 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.143454 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.152769 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.152814 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.152829 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.152850 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.152861 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.160720 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.174068 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.197323 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.219114 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.241991 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.256089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.256151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.256169 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.256190 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.256204 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.257453 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.280106 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.299012 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.314047 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.332390 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.349035 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.359610 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.359677 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.359692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.359716 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.359733 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.370032 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.397831 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.417040 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.436125 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.450526 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.462993 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.463119 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.463167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.463180 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.463243 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.463262 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.478867 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.497690 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.515640 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.533594 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.556619 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\"
,\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.566054 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.566316 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.566474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.566638 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.566786 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.575455 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.590060 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.607183 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.626740 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.626993 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.627180 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:12:48.627143156 +0000 UTC m=+50.336448582 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.627235 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.627606 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:48.627557837 +0000 UTC m=+50.336863473 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.669950 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.670027 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.670047 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.670079 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.670101 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.727660 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.727734 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.727807 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.727981 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.727984 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.728011 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.728038 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.728038 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.728121 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:48.728096672 +0000 UTC m=+50.437402138 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.728168 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.728000 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.728273 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:48.728249346 +0000 UTC m=+50.437554772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.728479 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:48.728430001 +0000 UTC m=+50.437735457 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.773836 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.773912 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.773932 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.773965 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.774174 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.838757 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.838822 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.838887 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.839048 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.839177 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:32 crc kubenswrapper[4766]: E1209 03:12:32.839414 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.877521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.877573 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.877585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.877604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.877616 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.980983 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.981059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.981071 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.981118 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:32 crc kubenswrapper[4766]: I1209 03:12:32.981132 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:32Z","lastTransitionTime":"2025-12-09T03:12:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.083487 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.083580 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.083609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.083650 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.083681 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.187045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.187088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.187098 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.187118 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.187129 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.290894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.291408 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.291523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.291635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.291721 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.394965 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.395011 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.395022 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.395040 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.395054 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.504124 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.504235 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.504251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.504278 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.504295 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.608011 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.608078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.608099 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.608128 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.608149 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.712358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.712419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.712437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.712461 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.712476 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.815284 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.815339 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.815357 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.815381 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.815396 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.919860 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.919947 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.919967 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.919996 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:33 crc kubenswrapper[4766]: I1209 03:12:33.920017 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:33Z","lastTransitionTime":"2025-12-09T03:12:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.023262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.023329 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.023351 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.023385 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.023409 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:34Z","lastTransitionTime":"2025-12-09T03:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.839041 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.839041 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.839049 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:34 crc kubenswrapper[4766]: E1209 03:12:34.839264 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:34 crc kubenswrapper[4766]: E1209 03:12:34.839405 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:34 crc kubenswrapper[4766]: E1209 03:12:34.839526 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.880036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.880084 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.880096 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.880116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.880130 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:34Z","lastTransitionTime":"2025-12-09T03:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.886978 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/0.log" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.894177 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9" exitCode=1 Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.894268 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9"} Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.895356 4766 scope.go:117] "RemoveContainer" containerID="7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.914736 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:34Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.933521 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:34Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.947276 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:34Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.960527 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:34Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.977633 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:34Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.983791 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.983871 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.983924 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.983955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.984005 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:34Z","lastTransitionTime":"2025-12-09T03:12:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:34 crc kubenswrapper[4766]: I1209 03:12:34.994842 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] 
pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:34Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.010614 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.028326 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.042698 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.055382 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.067813 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.083101 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.095837 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.096110 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.096270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.096455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.096599 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.110956 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:33Z\\\",\\\"message\\\":\\\"val\\\\nI1209 03:12:33.468007 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 03:12:33.468037 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 03:12:33.468447 6072 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI1209 03:12:33.468454 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 03:12:33.468464 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:12:33.468518 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:12:33.468540 6072 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:12:33.468545 6072 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:12:33.468555 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:12:33.468560 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:12:33.468574 6072 factory.go:656] Stopping watch factory\\\\nI1209 03:12:33.468592 6072 ovnkube.go:599] Stopped ovnkube\\\\nI1209 03:12:33.468612 6072 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:12:33.468622 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 03:12:33.468628 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:12:33.468634 6072 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620e
bf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.124740 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.200539 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.200613 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.200635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.200667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.200690 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.304619 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.304779 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.305305 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.305526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.305593 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.409148 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.409240 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.409265 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.409295 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.409315 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.512441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.512547 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.512575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.512614 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.512639 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.615616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.616162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.616184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.616238 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.616259 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.720017 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.720089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.720110 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.720144 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.720164 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.823327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.823394 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.823415 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.823445 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.823468 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.901840 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/0.log" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.905516 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.906119 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.922421 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.926503 4766 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.926553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.926567 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.926596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.926614 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:35Z","lastTransitionTime":"2025-12-09T03:12:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.942711 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.964570 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:33Z\\\",\\\"message\\\":\\\"val\\\\nI1209 03:12:33.468007 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 03:12:33.468037 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 03:12:33.468447 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:12:33.468454 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 
03:12:33.468464 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:12:33.468518 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:12:33.468540 6072 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:12:33.468545 6072 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:12:33.468555 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:12:33.468560 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:12:33.468574 6072 factory.go:656] Stopping watch factory\\\\nI1209 03:12:33.468592 6072 ovnkube.go:599] Stopped ovnkube\\\\nI1209 03:12:33.468612 6072 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:12:33.468622 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 03:12:33.468628 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:12:33.468634 6072 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.977733 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:35 crc kubenswrapper[4766]: I1209 03:12:35.995390 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:35Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.011809 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.024399 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.028899 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.028975 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.028991 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.029018 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.029037 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.037126 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.052391 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.067399 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:
16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.083113 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.100718 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.115404 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.130336 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.132152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.132187 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.132200 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.132241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.132256 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.236554 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.236625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.236648 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.236678 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.236697 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.299963 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc"] Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.300632 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.304567 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.306058 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.318385 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.331486 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.346091 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.346122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.346133 4766 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.346154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.346165 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.346233 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0
f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.359555 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.372456 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.372693 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.372915 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.372988 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.373036 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w59ps\" (UniqueName: \"kubernetes.io/projected/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-kube-api-access-w59ps\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.383533 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.405383 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" 
enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.420520 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.434002 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.449173 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.450281 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.450349 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.450366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.450391 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.450410 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.467912 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.474725 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.474847 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.474939 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.474980 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w59ps\" (UniqueName: \"kubernetes.io/projected/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-kube-api-access-w59ps\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.476112 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.476586 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.483462 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc 
kubenswrapper[4766]: I1209 03:12:36.488673 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.498889 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w59ps\" (UniqueName: \"kubernetes.io/projected/3520fc8f-8421-4ce0-b98a-f08f96ce2f2c-kube-api-access-w59ps\") pod \"ovnkube-control-plane-749d76644c-nc6gc\" (UID: \"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.506291 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.528566 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:33Z\\\",\\\"message\\\":\\\"val\\\\nI1209 03:12:33.468007 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 03:12:33.468037 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 03:12:33.468447 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:12:33.468454 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 
03:12:33.468464 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:12:33.468518 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:12:33.468540 6072 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:12:33.468545 6072 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:12:33.468555 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:12:33.468560 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:12:33.468574 6072 factory.go:656] Stopping watch factory\\\\nI1209 03:12:33.468592 6072 ovnkube.go:599] Stopped ovnkube\\\\nI1209 03:12:33.468612 6072 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:12:33.468622 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 03:12:33.468628 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:12:33.468634 6072 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.543161 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.554120 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.554156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.554168 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.554188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.554200 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.618443 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" Dec 09 03:12:36 crc kubenswrapper[4766]: W1209 03:12:36.636476 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3520fc8f_8421_4ce0_b98a_f08f96ce2f2c.slice/crio-dfe2a2948e9b40fbb8b61b6793e67522c90a629bced66e064b6607008b2f4fa4 WatchSource:0}: Error finding container dfe2a2948e9b40fbb8b61b6793e67522c90a629bced66e064b6607008b2f4fa4: Status 404 returned error can't find the container with id dfe2a2948e9b40fbb8b61b6793e67522c90a629bced66e064b6607008b2f4fa4 Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.656984 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.657045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.657055 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.657074 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.657087 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.760647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.760696 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.760708 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.760726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.760737 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.838544 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.838624 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:36 crc kubenswrapper[4766]: E1209 03:12:36.838668 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:36 crc kubenswrapper[4766]: E1209 03:12:36.838878 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.839203 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:36 crc kubenswrapper[4766]: E1209 03:12:36.839388 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.864272 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.864324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.864334 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.864353 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.864366 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.913075 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/1.log" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.914027 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/0.log" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.920334 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c" exitCode=1 Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.920483 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.920569 4766 scope.go:117] "RemoveContainer" containerID="7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.922590 4766 scope.go:117] "RemoveContainer" containerID="c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c" Dec 09 03:12:36 crc kubenswrapper[4766]: E1209 03:12:36.923132 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.924761 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" event={"ID":"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c","Type":"ContainerStarted","Data":"bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.924908 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" event={"ID":"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c","Type":"ContainerStarted","Data":"dfe2a2948e9b40fbb8b61b6793e67522c90a629bced66e064b6607008b2f4fa4"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.949128 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.967853 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.967901 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.967912 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.967931 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.967943 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:36Z","lastTransitionTime":"2025-12-09T03:12:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.969472 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:36 crc kubenswrapper[4766]: I1209 03:12:36.986360 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.001419 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:36Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.018639 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.033620 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.067328 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.070543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.070583 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.070600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.070625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.070640 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.095992 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:33Z\\\",\\\"message\\\":\\\"val\\\\nI1209 03:12:33.468007 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 03:12:33.468037 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 03:12:33.468447 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:12:33.468454 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 
03:12:33.468464 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:12:33.468518 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:12:33.468540 6072 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:12:33.468545 6072 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:12:33.468555 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:12:33.468560 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:12:33.468574 6072 factory.go:656] Stopping watch factory\\\\nI1209 03:12:33.468592 6072 ovnkube.go:599] Stopped ovnkube\\\\nI1209 03:12:33.468612 6072 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:12:33.468622 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 03:12:33.468628 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:12:33.468634 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\":map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 03:12:36.195961 6210 lb_config.go:1031] Cluster endpoints for openshift-cluster-version/cluster-version-operator for network=default are: map[]\\\\nI1209 03:12:36.195994 6210 services_controller.go:443] Built service openshift-cluster-version/cluster-version-operator LB 
cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.182\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9099, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 03:12:36.195958 6210 services_controller.go:356] Processing sync for service openshift-config-operator/metrics for network=default\\\\nI1209 03:12:36.196018 6210 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1209 03:12:36.196030 6210 services_controller.go:444] Built service openshift-cluster-version/cluster-version-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 03:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",
\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 
03:12:37.115529 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.131629 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.145401 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.156087 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.167842 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.173187 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.173266 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.173281 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.173304 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.173320 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.182310 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z 
is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.194846 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.277009 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.277073 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.277089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.277113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.277128 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.380031 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.380093 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.380109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.380136 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.380156 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.490780 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.490861 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.490877 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.490917 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.490932 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.593400 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.593473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.593491 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.593515 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.593531 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.697549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.697603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.697614 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.697639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.697660 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.801192 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.801266 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.801276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.801296 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.801308 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.904504 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.904564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.904579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.904604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.904617 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:37Z","lastTransitionTime":"2025-12-09T03:12:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.932468 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" event={"ID":"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c","Type":"ContainerStarted","Data":"ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665"} Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.934921 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/1.log" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.939851 4766 scope.go:117] "RemoveContainer" containerID="c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c" Dec 09 03:12:37 crc kubenswrapper[4766]: E1209 03:12:37.940325 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.950322 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:37 crc kubenswrapper[4766]: I1209 03:12:37.973332 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd822378
47bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:37.999768 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:37Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.010307 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.010413 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.010443 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.010497 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.010521 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.024757 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.039413 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.054281 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.070359 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338
ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.085818 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.100408 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.114494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.114558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.114582 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc 
kubenswrapper[4766]: I1209 03:12:38.114615 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.114638 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.118173 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 
03:12:38.135535 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.152134 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.171809 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.199104 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff2cc602a4cc0091b02593d4894754fd909d6036556f5ad60129feb8d985da9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:33Z\\\",\\\"message\\\":\\\"val\\\\nI1209 03:12:33.468007 6072 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 03:12:33.468037 6072 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 03:12:33.468447 6072 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:12:33.468454 6072 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 
03:12:33.468464 6072 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:12:33.468518 6072 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:12:33.468540 6072 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:12:33.468545 6072 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:12:33.468555 6072 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:12:33.468560 6072 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:12:33.468574 6072 factory.go:656] Stopping watch factory\\\\nI1209 03:12:33.468592 6072 ovnkube.go:599] Stopped ovnkube\\\\nI1209 03:12:33.468612 6072 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:12:33.468622 6072 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 03:12:33.468628 6072 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:12:33.468634 6072 handler.go:208] Removed *v1.Node event handler 2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\":map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 03:12:36.195961 6210 lb_config.go:1031] Cluster endpoints for openshift-cluster-version/cluster-version-operator for network=default are: map[]\\\\nI1209 03:12:36.195994 6210 services_controller.go:443] Built service openshift-cluster-version/cluster-version-operator LB 
cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.182\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9099, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 03:12:36.195958 6210 services_controller.go:356] Processing sync for service openshift-config-operator/metrics for network=default\\\\nI1209 03:12:36.196018 6210 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1209 03:12:36.196030 6210 services_controller.go:444] Built service openshift-cluster-version/cluster-version-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 03:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",
\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 
03:12:38.208139 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-z6qth"] Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.209294 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:38 crc kubenswrapper[4766]: E1209 03:12:38.209435 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.217655 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.217699 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.217715 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.217736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.217753 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.218414 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.240242 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.260595 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\"
,\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.281084 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.300194 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.300176 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.300266 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sbsr\" (UniqueName: \"kubernetes.io/projected/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-kube-api-access-9sbsr\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.321036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.321106 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.321120 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.321143 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.321158 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.323349 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.346099 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.371994 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\":map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 03:12:36.195961 6210 lb_config.go:1031] Cluster endpoints for 
openshift-cluster-version/cluster-version-operator for network=default are: map[]\\\\nI1209 03:12:36.195994 6210 services_controller.go:443] Built service openshift-cluster-version/cluster-version-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.182\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9099, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 03:12:36.195958 6210 services_controller.go:356] Processing sync for service openshift-config-operator/metrics for network=default\\\\nI1209 03:12:36.196018 6210 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1209 03:12:36.196030 6210 services_controller.go:444] Built service openshift-cluster-version/cluster-version-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 03:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.387845 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.401455 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.401529 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sbsr\" (UniqueName: \"kubernetes.io/projected/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-kube-api-access-9sbsr\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:38 crc kubenswrapper[4766]: E1209 03:12:38.401698 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:38 crc kubenswrapper[4766]: E1209 03:12:38.401837 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs podName:5c9eb693-99eb-4b02-b33a-26d506eeb3f1 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:38.901809903 +0000 UTC m=+40.611115419 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs") pod "network-metrics-daemon-z6qth" (UID: "5c9eb693-99eb-4b02-b33a-26d506eeb3f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.406708 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc 
kubenswrapper[4766]: I1209 03:12:38.421513 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sbsr\" (UniqueName: \"kubernetes.io/projected/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-kube-api-access-9sbsr\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.423730 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.424515 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.424756 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.424781 4766 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.424816 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.424837 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.445609 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.461015 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.478531 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.497453 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.513610 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.527987 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.528110 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.528133 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.528154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.528171 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.533253 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.630629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.630692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.630708 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.630727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.630739 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.733059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.733107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.733116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.733132 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.733143 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.836506 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.836588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.836604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.836634 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.836675 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.838854 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.839013 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.838931 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:38 crc kubenswrapper[4766]: E1209 03:12:38.839303 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:38 crc kubenswrapper[4766]: E1209 03:12:38.839372 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:38 crc kubenswrapper[4766]: E1209 03:12:38.839436 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.854172 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.869341 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.884304 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.899131 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.905933 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:38 crc kubenswrapper[4766]: E1209 03:12:38.906288 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:38 crc kubenswrapper[4766]: E1209 03:12:38.906445 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs podName:5c9eb693-99eb-4b02-b33a-26d506eeb3f1 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:39.906409145 +0000 UTC m=+41.615714601 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs") pod "network-metrics-daemon-z6qth" (UID: "5c9eb693-99eb-4b02-b33a-26d506eeb3f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.913407 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc 
kubenswrapper[4766]: I1209 03:12:38.930435 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.938750 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.938796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.938807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.938824 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.938838 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:38Z","lastTransitionTime":"2025-12-09T03:12:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.948678 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.968359 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\":map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 03:12:36.195961 6210 lb_config.go:1031] Cluster endpoints for openshift-cluster-version/cluster-version-operator for network=default are: map[]\\\\nI1209 03:12:36.195994 6210 services_controller.go:443] Built service openshift-cluster-version/cluster-version-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.182\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9099, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 03:12:36.195958 6210 services_controller.go:356] Processing sync for service openshift-config-operator/metrics for network=default\\\\nI1209 03:12:36.196018 6210 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1209 03:12:36.196030 6210 services_controller.go:444] Built service openshift-cluster-version/cluster-version-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 03:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:38 crc kubenswrapper[4766]: I1209 03:12:38.981683 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.003800 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.018025 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd822378
47bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.038328 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.041913 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.041960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.041979 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.042010 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.042028 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.052672 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.065364 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.081549 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.100096 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.144781 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.144839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.144849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.144866 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.144878 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.248409 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.248488 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.248507 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.248540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.248558 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.353968 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.354040 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.354058 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.354088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.354108 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.457505 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.457845 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.457986 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.458107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.458197 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.563011 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.563095 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.563116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.563149 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.563174 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.666810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.666888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.666908 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.666938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.666962 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.770358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.770439 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.770468 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.770507 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.770535 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.838686 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:39 crc kubenswrapper[4766]: E1209 03:12:39.838921 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.874202 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.874315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.874335 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.874363 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.874384 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.917817 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:39 crc kubenswrapper[4766]: E1209 03:12:39.918051 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:39 crc kubenswrapper[4766]: E1209 03:12:39.918128 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs podName:5c9eb693-99eb-4b02-b33a-26d506eeb3f1 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:41.918104214 +0000 UTC m=+43.627409650 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs") pod "network-metrics-daemon-z6qth" (UID: "5c9eb693-99eb-4b02-b33a-26d506eeb3f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.978350 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.978405 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.978420 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.978444 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:39 crc kubenswrapper[4766]: I1209 03:12:39.978462 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:39Z","lastTransitionTime":"2025-12-09T03:12:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.083155 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.083289 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.083308 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.083343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.083371 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.186765 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.186832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.186853 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.186884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.186905 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.290538 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.290579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.290588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.290604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.290616 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.395044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.395093 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.395103 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.395126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.395139 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.494910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.495018 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.495082 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.495155 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.495184 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.516667 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:40Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.523459 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.523534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.523549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.523600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.523611 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.545105 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:40Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.551314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.551374 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.551394 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.551421 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.551442 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.572801 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:40Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.577691 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.577759 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.577780 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.577810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.577830 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.601823 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:40Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.607838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.607900 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.607912 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.607935 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.607948 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.627190 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:40Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.627443 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.630164 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.630300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.630323 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.630351 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.630374 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.734948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.735012 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.735034 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.735062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.735084 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.838354 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.838400 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.838466 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.838515 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.838686 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:40 crc kubenswrapper[4766]: E1209 03:12:40.838854 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.840164 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.840283 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.840315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.840350 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.840373 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.944359 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.944838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.945081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.945303 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:40 crc kubenswrapper[4766]: I1209 03:12:40.945450 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:40Z","lastTransitionTime":"2025-12-09T03:12:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.050089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.050191 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.050269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.050301 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.050320 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.154119 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.154188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.154249 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.154304 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.154405 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.258572 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.258655 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.258676 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.258706 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.258727 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.362586 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.362643 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.362667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.362701 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.362724 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.465756 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.465807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.465817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.465834 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.465845 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.569253 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.569337 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.569358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.569391 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.569417 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.673575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.673661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.673681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.673719 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.673749 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.777338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.777420 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.777446 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.777478 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.777499 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.838502 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:41 crc kubenswrapper[4766]: E1209 03:12:41.838747 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.882466 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.882536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.882562 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.882596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.882623 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.944618 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:41 crc kubenswrapper[4766]: E1209 03:12:41.944948 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:41 crc kubenswrapper[4766]: E1209 03:12:41.945093 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs podName:5c9eb693-99eb-4b02-b33a-26d506eeb3f1 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:45.945055927 +0000 UTC m=+47.654361393 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs") pod "network-metrics-daemon-z6qth" (UID: "5c9eb693-99eb-4b02-b33a-26d506eeb3f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.986680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.986761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.986796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.986829 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:41 crc kubenswrapper[4766]: I1209 03:12:41.986848 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:41Z","lastTransitionTime":"2025-12-09T03:12:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.090373 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.090454 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.090480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.090515 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.090540 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.193859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.193919 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.193942 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.193971 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.193990 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.297519 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.297627 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.297656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.297736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.297770 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.403800 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.403848 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.403859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.403877 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.403886 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.507344 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.507421 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.507434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.507457 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.507472 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.610114 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.610167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.610185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.610232 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.610254 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.713578 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.713672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.713700 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.713733 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.713765 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.822570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.822971 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.822985 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.823008 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.823019 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.838316 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.838387 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.838759 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:42 crc kubenswrapper[4766]: E1209 03:12:42.838683 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:42 crc kubenswrapper[4766]: E1209 03:12:42.839020 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:42 crc kubenswrapper[4766]: E1209 03:12:42.838928 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.927190 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.927255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.927267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.927283 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:42 crc kubenswrapper[4766]: I1209 03:12:42.927294 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:42Z","lastTransitionTime":"2025-12-09T03:12:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.030916 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.031009 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.031032 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.031070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.031091 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.134093 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.134188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.134263 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.134306 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.134326 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.239695 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.239780 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.239807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.239838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.239863 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.344379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.344445 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.344465 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.344494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.344520 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.448185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.448313 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.448341 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.448387 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.448415 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.551734 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.551818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.551832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.551862 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.551879 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.655741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.655814 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.655829 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.655853 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.655872 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.760054 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.760126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.760144 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.760172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.760191 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.838322 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:43 crc kubenswrapper[4766]: E1209 03:12:43.838654 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.863625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.863688 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.863710 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.863736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.863758 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.966810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.966863 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.966875 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.966894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:43 crc kubenswrapper[4766]: I1209 03:12:43.966907 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:43Z","lastTransitionTime":"2025-12-09T03:12:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.070607 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.070662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.070679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.070705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.070723 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.174740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.174806 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.174824 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.174850 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.174868 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.279476 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.279563 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.279593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.279633 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.279691 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.383471 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.383546 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.383565 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.383596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.383617 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.486943 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.487026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.487044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.487082 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.487107 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.590960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.591042 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.591062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.591093 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.591114 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.695019 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.695097 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.695120 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.695156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.695187 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.799150 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.799277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.799297 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.799324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.799343 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.839202 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.839392 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:44 crc kubenswrapper[4766]: E1209 03:12:44.839533 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:44 crc kubenswrapper[4766]: E1209 03:12:44.839642 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.839279 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:44 crc kubenswrapper[4766]: E1209 03:12:44.839798 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.906786 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.907542 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.907586 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.907634 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:44 crc kubenswrapper[4766]: I1209 03:12:44.907666 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:44Z","lastTransitionTime":"2025-12-09T03:12:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.010786 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.010873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.010894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.010937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.010977 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.114269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.114340 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.114355 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.114381 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.114397 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.217596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.218315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.218426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.218541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.218627 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.321088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.321122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.321131 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.321147 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.321157 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.424791 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.424888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.424916 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.424951 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.424974 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.527755 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.527830 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.527855 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.527890 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.527914 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.632991 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.633057 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.633075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.633102 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.633122 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.737021 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.737108 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.737133 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.737164 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.737187 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.838557 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:45 crc kubenswrapper[4766]: E1209 03:12:45.838919 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.841518 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.841770 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.841993 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.842199 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.842514 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.946161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.946279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.946316 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.946350 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.946370 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:45Z","lastTransitionTime":"2025-12-09T03:12:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:45 crc kubenswrapper[4766]: I1209 03:12:45.995035 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:45 crc kubenswrapper[4766]: E1209 03:12:45.995323 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:45 crc kubenswrapper[4766]: E1209 03:12:45.995437 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs podName:5c9eb693-99eb-4b02-b33a-26d506eeb3f1 nodeName:}" failed. No retries permitted until 2025-12-09 03:12:53.99539159 +0000 UTC m=+55.704697056 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs") pod "network-metrics-daemon-z6qth" (UID: "5c9eb693-99eb-4b02-b33a-26d506eeb3f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.050160 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.050275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.050302 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.050325 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.050338 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.153863 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.153971 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.154006 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.154039 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.154064 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.258070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.258136 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.258153 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.258183 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.258205 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.361106 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.361174 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.361195 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.361267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.361287 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.465185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.465287 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.465308 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.465334 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.465354 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.568785 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.568855 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.568876 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.568904 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.568925 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.671174 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.671260 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.671279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.671308 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.671329 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.774912 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.775005 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.775026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.775063 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.775083 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.839145 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.839263 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.839346 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:46 crc kubenswrapper[4766]: E1209 03:12:46.839390 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:46 crc kubenswrapper[4766]: E1209 03:12:46.839459 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:46 crc kubenswrapper[4766]: E1209 03:12:46.839551 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.879673 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.879726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.879739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.879762 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.879774 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.982851 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.982913 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.982930 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.982959 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:46 crc kubenswrapper[4766]: I1209 03:12:46.982977 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:46Z","lastTransitionTime":"2025-12-09T03:12:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.086389 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.086493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.086513 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.086540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.086561 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.191024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.191089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.191115 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.191149 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.191173 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.296980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.297077 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.297105 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.297138 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.297164 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.400812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.400923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.400946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.400982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.401006 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.504724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.504798 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.504822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.504856 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.504881 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.608094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.608154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.608164 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.608186 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.608199 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.712128 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.712185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.712201 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.712246 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.712264 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.815874 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.815949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.815968 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.816002 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.816023 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.838780 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:47 crc kubenswrapper[4766]: E1209 03:12:47.839107 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.919611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.919692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.919712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.919741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:47 crc kubenswrapper[4766]: I1209 03:12:47.919763 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:47Z","lastTransitionTime":"2025-12-09T03:12:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.023070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.023157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.023180 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.023240 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.023262 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.126657 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.126731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.126759 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.126793 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.126822 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.230906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.230982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.231008 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.231045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.231087 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.334728 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.334819 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.334842 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.334878 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.334903 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.438611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.438705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.438731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.438773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.438799 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.542729 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.542806 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.542822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.542844 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.542858 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.629132 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.629534 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 03:13:20.629466132 +0000 UTC m=+82.338771578 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.629689 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.629887 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.630024 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:13:20.629994007 +0000 UTC m=+82.339299473 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.648159 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.648202 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.648230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.648251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.648261 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.730457 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.730560 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.730604 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730716 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730784 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730808 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730851 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730898 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:13:20.73086415 +0000 UTC m=+82.440169606 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730937 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 03:13:20.730919862 +0000 UTC m=+82.440225328 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730956 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730972 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.730983 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.731029 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 03:13:20.731006874 +0000 UTC m=+82.440312540 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.751848 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.751892 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.751928 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.751947 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.751961 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.838557 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.838549 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.838574 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.838752 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.838833 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:48 crc kubenswrapper[4766]: E1209 03:12:48.839094 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.840772 4766 scope.go:117] "RemoveContainer" containerID="c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.853814 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.853857 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.853866 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.853879 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.853893 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.864700 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] 
pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:48Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.880208 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:48Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.900625 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:48Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.915276 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:48Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.936713 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:48Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.952952 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:48Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.960511 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.960557 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.960571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.960595 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.960608 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:48Z","lastTransitionTime":"2025-12-09T03:12:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:48 crc kubenswrapper[4766]: I1209 03:12:48.980003 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:48Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.002900 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\":map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 03:12:36.195961 6210 lb_config.go:1031] Cluster endpoints for 
openshift-cluster-version/cluster-version-operator for network=default are: map[]\\\\nI1209 03:12:36.195994 6210 services_controller.go:443] Built service openshift-cluster-version/cluster-version-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.182\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9099, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 03:12:36.195958 6210 services_controller.go:356] Processing sync for service openshift-config-operator/metrics for network=default\\\\nI1209 03:12:36.196018 6210 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1209 03:12:36.196030 6210 services_controller.go:444] Built service openshift-cluster-version/cluster-version-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 03:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:48Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.017193 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.030495 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc 
kubenswrapper[4766]: I1209 03:12:49.055919 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.064547 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.064603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.064622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.064642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.064655 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.073068 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.087012 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.102002 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.116039 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.127003 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.167733 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.167790 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.167804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.168154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.168189 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.271739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.271796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.271815 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.271836 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.271851 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.374301 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.374339 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.374347 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.374366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.374374 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.477794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.477859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.477875 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.477898 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.477913 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.580969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.581050 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.581067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.581093 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.581114 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.684483 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.684555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.684571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.684598 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.684616 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.788632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.788684 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.788693 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.788709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.788720 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.839035 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:49 crc kubenswrapper[4766]: E1209 03:12:49.839356 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.892043 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.892104 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.892113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.892130 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.892141 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.994781 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.994839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.994858 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.994887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.994907 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:49Z","lastTransitionTime":"2025-12-09T03:12:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.996013 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/2.log" Dec 09 03:12:49 crc kubenswrapper[4766]: I1209 03:12:49.997292 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/1.log" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.000719 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab" exitCode=1 Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.000792 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.000871 4766 scope.go:117] "RemoveContainer" containerID="c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.002320 4766 scope.go:117] "RemoveContainer" containerID="aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab" Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.002664 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.022875 4766 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b22
2dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.046894 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.063290 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.081282 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.098309 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.098475 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.098513 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.098534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.098563 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.098584 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.120573 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.139579 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.169557 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\":map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 03:12:36.195961 6210 lb_config.go:1031] Cluster endpoints for 
openshift-cluster-version/cluster-version-operator for network=default are: map[]\\\\nI1209 03:12:36.195994 6210 services_controller.go:443] Built service openshift-cluster-version/cluster-version-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.182\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9099, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 03:12:36.195958 6210 services_controller.go:356] Processing sync for service openshift-config-operator/metrics for network=default\\\\nI1209 03:12:36.196018 6210 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1209 03:12:36.196030 6210 services_controller.go:444] Built service openshift-cluster-version/cluster-version-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 03:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 services_controller.go:453] Built service 
openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62
682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.183707 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.199386 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc 
kubenswrapper[4766]: I1209 03:12:50.201722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.201789 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.201819 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.201846 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.201860 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.215360 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.231477 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.251957 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.270525 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.284891 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.301779 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.305576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.305631 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.305641 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.305661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.305673 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.408763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.408820 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.408838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.408872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.408887 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.511895 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.511963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.511975 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.512001 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.512014 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.615578 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.615650 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.615663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.615692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.615707 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.719263 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.719332 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.719350 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.719381 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.719398 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.823101 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.823155 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.823167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.823189 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.823203 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.835544 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.835616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.835633 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.835656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.835672 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.839009 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.839044 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.839247 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.839359 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.839596 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.839767 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.855648 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.862458 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.862526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.862546 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.862575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.862596 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.887329 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.892876 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.892917 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.892929 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.892949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.892964 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.909805 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.914901 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.914944 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.914954 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.914974 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.914988 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.930785 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.935659 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.935722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.935739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.935769 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.935786 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.951869 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:50Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:50 crc kubenswrapper[4766]: E1209 03:12:50.952258 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.954512 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.954551 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.954562 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.954588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:50 crc kubenswrapper[4766]: I1209 03:12:50.954604 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:50Z","lastTransitionTime":"2025-12-09T03:12:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.008576 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/2.log" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.058526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.058576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.058590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.058618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.058634 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.161673 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.161727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.161739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.161758 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.161770 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.264900 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.264942 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.264952 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.264969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.264982 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.369055 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.369136 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.369157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.369191 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.369246 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.472710 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.472806 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.472832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.472868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.472894 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.576011 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.576084 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.576109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.576149 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.576175 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.679088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.679128 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.679139 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.679154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.679164 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.782502 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.782595 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.782633 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.782661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.782681 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.838286 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:51 crc kubenswrapper[4766]: E1209 03:12:51.838439 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.886390 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.886474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.886500 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.886534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.886560 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.990714 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.990822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.990846 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.990893 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:51 crc kubenswrapper[4766]: I1209 03:12:51.990918 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:51Z","lastTransitionTime":"2025-12-09T03:12:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.094529 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.094609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.094633 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.094670 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.094696 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.198531 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.198627 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.198663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.198699 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.198723 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.303393 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.303467 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.303491 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.303525 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.303547 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.407191 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.407274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.407292 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.407316 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.407334 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.510354 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.510414 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.510424 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.510444 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.510464 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.615120 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.615276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.615300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.615328 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.615347 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.719981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.720056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.720080 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.720115 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.720138 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.823949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.824036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.824061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.824095 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.824119 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.838640 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.838658 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:52 crc kubenswrapper[4766]: E1209 03:12:52.838877 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:52 crc kubenswrapper[4766]: E1209 03:12:52.838966 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.838677 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:52 crc kubenswrapper[4766]: E1209 03:12:52.839119 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.928155 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.928254 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.928272 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.928335 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:52 crc kubenswrapper[4766]: I1209 03:12:52.928365 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:52Z","lastTransitionTime":"2025-12-09T03:12:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.030987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.031073 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.031092 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.031118 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.031138 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.052360 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.071707 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.079669 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.099189 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.120230 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.134617 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.134681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.134699 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.134727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.134762 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.150784 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.175187 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.211333 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\":map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 03:12:36.195961 6210 lb_config.go:1031] Cluster endpoints for 
openshift-cluster-version/cluster-version-operator for network=default are: map[]\\\\nI1209 03:12:36.195994 6210 services_controller.go:443] Built service openshift-cluster-version/cluster-version-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.182\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9099, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 03:12:36.195958 6210 services_controller.go:356] Processing sync for service openshift-config-operator/metrics for network=default\\\\nI1209 03:12:36.196018 6210 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1209 03:12:36.196030 6210 services_controller.go:444] Built service openshift-cluster-version/cluster-version-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 03:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 services_controller.go:453] Built service 
openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62
682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.232292 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.238296 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.238370 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.238392 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.238423 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.238446 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.253785 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc 
kubenswrapper[4766]: I1209 03:12:53.279911 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.301362 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.316788 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.332532 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.341743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.341809 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.341856 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.341892 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.341917 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.355553 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z 
is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.367610 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.381514 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:
16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.397858 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:53Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.445267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.445484 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc 
kubenswrapper[4766]: I1209 03:12:53.445584 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.445733 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.445833 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.549349 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.549426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.549452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.549484 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.549508 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.653275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.653354 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.653434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.653472 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.653501 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.757648 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.758156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.758453 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.758616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.758737 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.838186 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:53 crc kubenswrapper[4766]: E1209 03:12:53.838943 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.862750 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.863075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.863160 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.863272 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.863372 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.967604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.967685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.967707 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.967735 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.967757 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:53Z","lastTransitionTime":"2025-12-09T03:12:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:53 crc kubenswrapper[4766]: I1209 03:12:53.996189 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:53 crc kubenswrapper[4766]: E1209 03:12:53.996564 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:53 crc kubenswrapper[4766]: E1209 03:12:53.996774 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs podName:5c9eb693-99eb-4b02-b33a-26d506eeb3f1 nodeName:}" failed. No retries permitted until 2025-12-09 03:13:09.99664504 +0000 UTC m=+71.705950506 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs") pod "network-metrics-daemon-z6qth" (UID: "5c9eb693-99eb-4b02-b33a-26d506eeb3f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.071910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.072428 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.072705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.072880 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.073019 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.177077 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.177387 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.177652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.177813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.177941 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.282012 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.282092 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.282116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.282189 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.282249 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.387019 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.387115 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.387137 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.387170 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.387192 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.491292 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.491390 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.491418 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.491459 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.491487 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.596028 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.596109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.596131 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.596161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.596182 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.700736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.700807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.700825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.700855 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.700882 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.804235 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.804285 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.804298 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.804319 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.804334 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.838545 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.838748 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:54 crc kubenswrapper[4766]: E1209 03:12:54.838763 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:54 crc kubenswrapper[4766]: E1209 03:12:54.838900 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.838555 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:54 crc kubenswrapper[4766]: E1209 03:12:54.839014 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.907403 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.907521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.907542 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.907576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:54 crc kubenswrapper[4766]: I1209 03:12:54.907599 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:54Z","lastTransitionTime":"2025-12-09T03:12:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.010040 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.010084 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.010095 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.010111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.010122 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.114386 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.114462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.114481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.114511 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.114531 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.218379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.218458 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.218480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.218515 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.218537 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.321062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.321132 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.321150 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.321179 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.321199 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.423352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.423401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.423413 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.423431 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.423465 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.525977 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.526033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.526044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.526061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.526070 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.629278 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.629407 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.629432 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.629474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.629502 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.732517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.732592 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.732609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.732635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.732651 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.835653 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.835745 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.835761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.835782 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.835819 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.838910 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:55 crc kubenswrapper[4766]: E1209 03:12:55.839027 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.938585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.938626 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.938637 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.938652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:55 crc kubenswrapper[4766]: I1209 03:12:55.938663 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:55Z","lastTransitionTime":"2025-12-09T03:12:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.040660 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.040703 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.040716 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.040731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.040742 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.142910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.142962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.142974 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.142988 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.142998 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.245850 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.245885 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.245897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.245914 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.245928 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.348187 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.348244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.348254 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.348269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.348278 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.451064 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.451109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.451118 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.451131 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.451142 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.552962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.553078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.553098 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.553116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.553127 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.655807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.655887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.655910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.655945 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.655969 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.759351 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.759394 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.759405 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.759424 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.759434 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.838882 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.839003 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.839108 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:56 crc kubenswrapper[4766]: E1209 03:12:56.839031 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:56 crc kubenswrapper[4766]: E1209 03:12:56.839276 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:56 crc kubenswrapper[4766]: E1209 03:12:56.839525 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.861796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.861833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.861845 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.861860 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.861873 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.964753 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.964965 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.965003 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.965036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:56 crc kubenswrapper[4766]: I1209 03:12:56.965055 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:56Z","lastTransitionTime":"2025-12-09T03:12:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.068873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.068927 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.068949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.068977 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.068989 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.172692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.172800 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.172813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.172831 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.172841 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.277045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.277143 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.277165 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.277244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.277269 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.381375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.381491 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.381548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.381632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.381664 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.486087 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.486182 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.486201 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.486282 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.486309 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.589789 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.589905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.589924 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.589955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.589976 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.693453 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.693526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.693545 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.693573 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.693595 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.796638 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.796753 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.796784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.796825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.796852 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.838463 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:57 crc kubenswrapper[4766]: E1209 03:12:57.838749 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.901559 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.901633 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.901656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.901690 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:57 crc kubenswrapper[4766]: I1209 03:12:57.901714 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:57Z","lastTransitionTime":"2025-12-09T03:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.006415 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.006500 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.006520 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.006558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.006579 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.110344 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.110424 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.110436 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.110459 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.110474 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.213982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.214034 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.214043 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.214059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.214068 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.316794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.316857 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.316868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.316893 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.316911 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.420523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.420567 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.420576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.420594 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.420609 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.524680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.524727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.524736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.524752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.524763 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.627121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.627190 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.627201 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.627236 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.627245 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.729572 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.729607 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.729616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.729630 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.729640 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.832792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.832839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.832849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.832866 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.832878 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.838413 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.838450 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:12:58 crc kubenswrapper[4766]: E1209 03:12:58.838658 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.838714 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:12:58 crc kubenswrapper[4766]: E1209 03:12:58.838816 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:12:58 crc kubenswrapper[4766]: E1209 03:12:58.838891 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.850609 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.870121 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.884429 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.899359 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.918132 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c152a60ec9da4481af0913cd056a89270152fae8c34df279645ef1041a52ad0c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"message\\\":\\\":map[10.217.4.153:5443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5e50827b-d271-442b-b8a7-7f33b2cd6b11}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 03:12:36.195961 6210 lb_config.go:1031] Cluster endpoints for 
openshift-cluster-version/cluster-version-operator for network=default are: map[]\\\\nI1209 03:12:36.195994 6210 services_controller.go:443] Built service openshift-cluster-version/cluster-version-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.182\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9099, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1209 03:12:36.195958 6210 services_controller.go:356] Processing sync for service openshift-config-operator/metrics for network=default\\\\nI1209 03:12:36.196018 6210 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1209 03:12:36.196030 6210 services_controller.go:444] Built service openshift-cluster-version/cluster-version-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1209 03:12:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 services_controller.go:453] Built service 
openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62
682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.932976 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.935153 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.935244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.935255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:58 crc 
kubenswrapper[4766]: I1209 03:12:58.935270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.935281 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:58Z","lastTransitionTime":"2025-12-09T03:12:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.949769 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.963666 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:
36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.977329 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.989770 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:58 crc kubenswrapper[4766]: I1209 03:12:58.999460 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:58Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.012395 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.022455 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.033787 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.037537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.037637 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.037663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 
03:12:59.037698 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.037719 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.044262 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.056086 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.069692 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.140331 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.140557 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.140658 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.140769 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.140878 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.244382 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.244440 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.244460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.244488 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.244512 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.346948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.347075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.347087 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.347131 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.347146 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.450142 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.450188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.450198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.450236 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.450252 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.553647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.553715 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.553731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.553755 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.553771 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.656438 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.656539 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.656569 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.656603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.656632 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.759963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.760078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.760094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.760123 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.760140 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.838133 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:12:59 crc kubenswrapper[4766]: E1209 03:12:59.838311 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.866599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.866742 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.866797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.866842 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.866872 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.941368 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.942696 4766 scope.go:117] "RemoveContainer" containerID="aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab" Dec 09 03:12:59 crc kubenswrapper[4766]: E1209 03:12:59.943099 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.957827 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.971178 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.971548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.971662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.971767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.971852 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:12:59Z","lastTransitionTime":"2025-12-09T03:12:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.972478 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:12:59 crc kubenswrapper[4766]: I1209 03:12:59.996055 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:59Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.012186 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd822378
47bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.026316 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.045722 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.067259 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.074472 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.074537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc 
kubenswrapper[4766]: I1209 03:13:00.074554 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.074578 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.074594 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.083626 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\"
,\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.096246 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.110795 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.124536 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.140183 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.160447 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.172946 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.177938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.178030 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.178053 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.178081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.178102 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.188564 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc 
kubenswrapper[4766]: I1209 03:13:00.201438 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.221005 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:00Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.281176 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.281241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.281252 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.281267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.281278 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.383986 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.384069 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.384086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.384113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.384128 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.487750 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.487822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.487841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.487868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.487884 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.590839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.590888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.590901 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.590919 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.590931 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.694883 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.694964 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.694982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.695011 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.696438 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.800089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.800179 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.800203 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.800284 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.800310 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.838827 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.838946 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:00 crc kubenswrapper[4766]: E1209 03:13:00.839073 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.838955 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:00 crc kubenswrapper[4766]: E1209 03:13:00.839165 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:00 crc kubenswrapper[4766]: E1209 03:13:00.839420 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.904346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.904416 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.904439 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.904473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:00 crc kubenswrapper[4766]: I1209 03:13:00.904500 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:00Z","lastTransitionTime":"2025-12-09T03:13:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.007480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.007564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.007586 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.007616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.007637 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.110674 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.110722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.110739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.110771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.110783 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.214603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.214653 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.214662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.214681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.214695 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.318550 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.318633 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.318653 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.318684 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.318703 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.338317 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.338408 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.338436 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.338476 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.338503 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: E1209 03:13:01.362800 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:01Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.369074 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.369318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.369356 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.369396 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.369425 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: E1209 03:13:01.393653 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:01Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.400423 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.400498 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.400530 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.400615 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.400637 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: E1209 03:13:01.419627 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:01Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.426277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.426335 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.426348 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.426374 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.426391 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: E1209 03:13:01.449113 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:01Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.454988 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.455045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.455056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.455075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.455086 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: E1209 03:13:01.469885 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:01Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:01 crc kubenswrapper[4766]: E1209 03:13:01.470146 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.472275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.472337 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.472352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.472382 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.472399 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.576909 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.576978 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.577004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.577042 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.577069 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.681379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.681428 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.681441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.681462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.681476 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.785041 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.785094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.785111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.785137 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.785157 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.838859 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:01 crc kubenswrapper[4766]: E1209 03:13:01.839031 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.889469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.889557 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.889956 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.890038 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.890420 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.995074 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.995159 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.995185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.995255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:01 crc kubenswrapper[4766]: I1209 03:13:01.995287 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:01Z","lastTransitionTime":"2025-12-09T03:13:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.097974 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.098044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.098062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.098091 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.098114 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.202464 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.202518 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.202530 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.202552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.202567 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.306545 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.306639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.306658 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.306688 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.306708 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.409575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.409623 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.409632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.409649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.409660 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.512883 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.512970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.512982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.513002 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.513014 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.619086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.619141 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.619154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.619175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.619186 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.722070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.722134 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.722145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.722162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.722172 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.831541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.831592 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.831604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.831622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.831635 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.839636 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.839635 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:02 crc kubenswrapper[4766]: E1209 03:13:02.839757 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:02 crc kubenswrapper[4766]: E1209 03:13:02.839906 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.839988 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:02 crc kubenswrapper[4766]: E1209 03:13:02.840063 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.934862 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.934936 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.934953 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.934980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:02 crc kubenswrapper[4766]: I1209 03:13:02.934998 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:02Z","lastTransitionTime":"2025-12-09T03:13:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.037291 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.037333 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.037342 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.037358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.037368 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.140111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.140152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.140162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.140180 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.140195 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.242969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.243006 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.243017 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.243032 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.243041 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.345646 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.345686 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.345698 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.345715 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.345726 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.448671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.448738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.448756 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.448782 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.448798 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.551244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.551287 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.551299 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.551315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.551348 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.653930 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.653974 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.653989 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.654004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.654016 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.757406 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.757469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.757481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.757500 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.757512 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.839425 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:03 crc kubenswrapper[4766]: E1209 03:13:03.839575 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.859603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.859643 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.859653 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.859667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.859677 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.962118 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.962156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.962168 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.962184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:03 crc kubenswrapper[4766]: I1209 03:13:03.962197 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:03Z","lastTransitionTime":"2025-12-09T03:13:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.073288 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.073330 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.073343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.073366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.073381 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.176398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.176441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.176463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.176481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.176494 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.279696 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.279758 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.279769 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.279790 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.279800 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.382079 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.382121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.382135 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.382152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.382164 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.484731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.484780 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.484793 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.484811 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.484847 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.587924 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.587970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.587979 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.587999 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.588014 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.691340 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.691397 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.691410 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.691429 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.691444 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.794194 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.794277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.794286 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.794343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.794354 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.839108 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.839108 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:04 crc kubenswrapper[4766]: E1209 03:13:04.839323 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.839139 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:04 crc kubenswrapper[4766]: E1209 03:13:04.839398 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:04 crc kubenswrapper[4766]: E1209 03:13:04.839516 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.897534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.897583 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.897600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.897624 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:04 crc kubenswrapper[4766]: I1209 03:13:04.897643 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:04Z","lastTransitionTime":"2025-12-09T03:13:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.002766 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.002811 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.002825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.002851 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.002864 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.105505 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.105549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.105561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.105579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.105591 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.207666 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.207720 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.207766 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.207784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.207796 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.309802 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.309846 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.309857 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.309872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.309883 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.412566 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.412612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.412622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.412640 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.412654 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.515111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.515158 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.515170 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.515191 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.515203 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.617739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.617796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.617807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.617825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.617837 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.719949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.720032 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.720048 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.720070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.720084 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.823037 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.823083 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.823094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.823112 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.823127 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.838374 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:05 crc kubenswrapper[4766]: E1209 03:13:05.838513 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.926667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.926712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.926724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.926743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:05 crc kubenswrapper[4766]: I1209 03:13:05.926756 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:05Z","lastTransitionTime":"2025-12-09T03:13:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.029733 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.029784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.029795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.029812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.029822 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.132293 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.132334 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.132343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.132359 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.132368 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.236236 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.236291 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.236304 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.236323 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.236338 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.340126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.340230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.340251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.340273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.340284 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.443116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.443173 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.443183 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.443204 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.443230 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.546043 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.546089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.546108 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.546127 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.546166 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.650036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.650117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.650129 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.650149 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.650163 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.753717 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.753781 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.753793 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.753814 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.753832 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.839078 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.839189 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.839189 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:06 crc kubenswrapper[4766]: E1209 03:13:06.839376 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:06 crc kubenswrapper[4766]: E1209 03:13:06.839523 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:06 crc kubenswrapper[4766]: E1209 03:13:06.839667 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.856314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.856345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.856354 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.856367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.856376 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.958767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.958845 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.958865 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.958896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:06 crc kubenswrapper[4766]: I1209 03:13:06.958916 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:06Z","lastTransitionTime":"2025-12-09T03:13:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.061420 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.061509 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.061528 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.061565 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.061604 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.164927 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.164972 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.164982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.164997 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.165007 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.267124 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.267179 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.267192 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.267233 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.267251 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.370345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.370394 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.370403 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.370425 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.370437 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.473685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.473734 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.473743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.473763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.473774 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.576581 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.576627 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.576639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.576661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.576677 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.680045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.680106 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.680121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.680142 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.680161 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.782715 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.782767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.782777 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.782797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.782809 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.838795 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:07 crc kubenswrapper[4766]: E1209 03:13:07.839049 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.886139 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.886193 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.886207 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.886245 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.886260 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.989184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.989318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.989339 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.989370 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:07 crc kubenswrapper[4766]: I1209 03:13:07.989394 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:07Z","lastTransitionTime":"2025-12-09T03:13:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.091777 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.091830 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.091843 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.091865 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.091876 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.194679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.194741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.194754 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.194779 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.194795 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.298244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.298297 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.298339 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.298375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.298391 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.400887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.400936 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.400948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.400968 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.400979 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.503571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.503617 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.503629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.503649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.503662 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.606574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.606628 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.606639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.606657 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.606668 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.710204 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.710275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.710290 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.710313 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.710327 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.813612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.813662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.813672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.813694 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.813710 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.839269 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.839352 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:08 crc kubenswrapper[4766]: E1209 03:13:08.839447 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.839269 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:08 crc kubenswrapper[4766]: E1209 03:13:08.839534 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:08 crc kubenswrapper[4766]: E1209 03:13:08.839649 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.859057 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338
ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.880129 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.900358 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.917098 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.917141 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.917152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 
03:13:08.917168 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.917179 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:08Z","lastTransitionTime":"2025-12-09T03:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.917724 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.929620 4766 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.948891 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.963771 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.978658 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:08 crc kubenswrapper[4766]: I1209 03:13:08.997173 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 
services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:08Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.007955 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:09Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.017952 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:09Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc 
kubenswrapper[4766]: I1209 03:13:09.019819 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.019868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.019881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.019912 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.019925 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.034273 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:09Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.049117 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:09Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.059428 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:09Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.071652 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:09Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.086489 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:09Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.099582 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:09Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.122938 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.122987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.122998 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.123015 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.123027 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.225812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.225860 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.225874 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.225894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.225932 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.329335 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.329411 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.329422 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.329440 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.329451 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.432883 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.432934 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.432946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.432970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.432990 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.536418 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.536463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.536474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.536494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.536508 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.639604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.639652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.639661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.639681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.639691 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.742694 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.742771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.742784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.742811 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.742827 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.839165 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:09 crc kubenswrapper[4766]: E1209 03:13:09.839435 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.846165 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.846227 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.846244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.846261 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.846273 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.949652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.949708 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.949717 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.949734 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:09 crc kubenswrapper[4766]: I1209 03:13:09.949744 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:09Z","lastTransitionTime":"2025-12-09T03:13:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.013009 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:10 crc kubenswrapper[4766]: E1209 03:13:10.013205 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:13:10 crc kubenswrapper[4766]: E1209 03:13:10.013289 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs podName:5c9eb693-99eb-4b02-b33a-26d506eeb3f1 nodeName:}" failed. No retries permitted until 2025-12-09 03:13:42.013274882 +0000 UTC m=+103.722580308 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs") pod "network-metrics-daemon-z6qth" (UID: "5c9eb693-99eb-4b02-b33a-26d506eeb3f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.052553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.052611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.052624 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.052645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.052659 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.157282 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.157358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.157382 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.157416 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.157436 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.261517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.261552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.261561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.261580 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.261590 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.364955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.365009 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.365022 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.365045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.365061 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.467500 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.467559 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.467569 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.467587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.467599 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.570062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.570103 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.570115 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.570132 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.570143 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.673068 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.673174 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.673205 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.673276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.673305 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.775909 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.775951 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.775962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.775980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.775992 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.839162 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.839177 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.839200 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:10 crc kubenswrapper[4766]: E1209 03:13:10.839502 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:10 crc kubenswrapper[4766]: E1209 03:13:10.839716 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:10 crc kubenswrapper[4766]: E1209 03:13:10.839883 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.879859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.879913 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.879928 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.879947 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.879960 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.983582 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.983660 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.983670 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.983685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:10 crc kubenswrapper[4766]: I1209 03:13:10.983696 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:10Z","lastTransitionTime":"2025-12-09T03:13:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.086471 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.086525 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.086537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.086556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.086567 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.095527 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/0.log" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.095595 4766 generic.go:334] "Generic (PLEG): container finished" podID="c83a9d31-9c87-4a13-ab9a-2992e852eb47" containerID="d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0" exitCode=1 Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.095631 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gx9l2" event={"ID":"c83a9d31-9c87-4a13-ab9a-2992e852eb47","Type":"ContainerDied","Data":"d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.096191 4766 scope.go:117] "RemoveContainer" containerID="d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.114174 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.136096 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 
services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.147424 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.158522 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc 
kubenswrapper[4766]: I1209 03:13:11.172068 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.186740 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.190467 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 
03:13:11.190492 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.190502 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.190517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.190527 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.196543 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.209442 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.223034 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:10Z\\\",\\\"message\\\":\\\"2025-12-09T03:12:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395\\\\n2025-12-09T03:12:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395 to /host/opt/cni/bin/\\\\n2025-12-09T03:12:25Z [verbose] multus-daemon started\\\\n2025-12-09T03:12:25Z [verbose] Readiness Indicator file check\\\\n2025-12-09T03:13:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.235279 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.250877 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.265696 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.279920 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.292671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.292707 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.292718 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.292737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.292751 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.293671 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338
ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.308446 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.323203 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.335855 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.395399 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.395444 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.395452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.395467 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.395478 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.499240 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.499285 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.499297 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.499317 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.499328 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.601804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.601915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.601940 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.601987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.602016 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.648763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.648840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.648854 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.648872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.648885 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: E1209 03:13:11.668562 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.673828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.673884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.673895 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.673915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.673926 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: E1209 03:13:11.687985 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.692488 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.692521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.692530 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.692545 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.692555 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: E1209 03:13:11.705993 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.710316 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.710358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.710367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.710398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.710410 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: E1209 03:13:11.725182 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.735102 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.735150 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.735162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.735183 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.735226 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: E1209 03:13:11.749369 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:11Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:11 crc kubenswrapper[4766]: E1209 03:13:11.749516 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.751308 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.751345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.751356 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.751369 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.751379 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.838470 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:11 crc kubenswrapper[4766]: E1209 03:13:11.838628 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.854743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.854792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.854804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.854825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.854837 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.958437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.958517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.958541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.958600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:11 crc kubenswrapper[4766]: I1209 03:13:11.958625 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:11Z","lastTransitionTime":"2025-12-09T03:13:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.062255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.062318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.062331 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.062351 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.062361 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.100573 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/0.log" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.100654 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gx9l2" event={"ID":"c83a9d31-9c87-4a13-ab9a-2992e852eb47","Type":"ContainerStarted","Data":"1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.117097 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 
03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 
UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.129601 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.146548 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.160164 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.165344 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.165412 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.165430 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.165450 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.165461 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.174566 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.191510 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.209193 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.234516 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.267382 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 
services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.268131 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.268202 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.268235 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.268258 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.268274 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.283761 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.299884 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc 
kubenswrapper[4766]: I1209 03:13:12.317096 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.334447 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.349295 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.366081 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.371262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.371310 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.371324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.371343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.371355 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.392248 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:10Z\\\",\\\"message\\\":\\\"2025-12-09T03:12:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395\\\\n2025-12-09T03:12:25+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395 to /host/opt/cni/bin/\\\\n2025-12-09T03:12:25Z [verbose] multus-daemon started\\\\n2025-12-09T03:12:25Z [verbose] Readiness Indicator file check\\\\n2025-12-09T03:13:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:13:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.420350 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f7
6aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:12Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.473807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.474161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.474248 4766 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.474329 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.474408 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.577034 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.577081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.577094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.577117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.577128 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.680626 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.680721 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.680743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.680770 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.680789 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.783647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.783709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.783723 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.783748 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.783764 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.839365 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:12 crc kubenswrapper[4766]: E1209 03:13:12.839650 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.839735 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.841196 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:12 crc kubenswrapper[4766]: E1209 03:13:12.841355 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:12 crc kubenswrapper[4766]: E1209 03:13:12.841472 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.841849 4766 scope.go:117] "RemoveContainer" containerID="aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.886884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.886944 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.886962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.886989 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.887006 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.989695 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.989832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.989929 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.990004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:12 crc kubenswrapper[4766]: I1209 03:13:12.990071 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:12Z","lastTransitionTime":"2025-12-09T03:13:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.093616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.093668 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.093681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.093706 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.093721 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.107387 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/2.log" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.111537 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.111975 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.129405 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.143638 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.157127 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.172525 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.187325 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.196587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.196618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.196629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.196646 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.196657 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.212202 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 
services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:13:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.230308 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.250171 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc 
kubenswrapper[4766]: I1209 03:13:13.269399 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.288910 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.299572 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.299671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.299698 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.299738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.299764 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.308812 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.329699 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.353776 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:10Z\\\",\\\"message\\\":\\\"2025-12-09T03:12:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395\\\\n2025-12-09T03:12:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395 to /host/opt/cni/bin/\\\\n2025-12-09T03:12:25Z [verbose] multus-daemon started\\\\n2025-12-09T03:12:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T03:13:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:13:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.373791 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f7
6aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.389314 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\"
,\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.402948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.403005 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.403017 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.403039 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.403053 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.407444 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.420096 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:13Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.506233 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.506307 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.506325 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.506352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.506372 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.609380 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.609434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.609443 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.609462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.609473 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.712236 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.712307 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.712325 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.712356 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.712378 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.815102 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.815159 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.815172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.815194 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.815208 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.838967 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:13 crc kubenswrapper[4766]: E1209 03:13:13.839135 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.918596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.918654 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.918665 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.918685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:13 crc kubenswrapper[4766]: I1209 03:13:13.918699 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:13Z","lastTransitionTime":"2025-12-09T03:13:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.021977 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.022019 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.022027 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.022045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.022057 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.120593 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/3.log" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.121960 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/2.log" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.127156 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" exitCode=1 Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.127349 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.127453 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.127740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.127772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.127804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.127828 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.127644 4766 scope.go:117] "RemoveContainer" containerID="aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.128094 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:13:14 crc kubenswrapper[4766]: E1209 03:13:14.128327 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.155616 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.174034 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.188345 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.203864 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.217308 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:10Z\\\",\\\"message\\\":\\\"2025-12-09T03:12:25+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395\\\\n2025-12-09T03:12:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395 to /host/opt/cni/bin/\\\\n2025-12-09T03:12:25Z [verbose] multus-daemon started\\\\n2025-12-09T03:12:25Z [verbose] Readiness Indicator file check\\\\n2025-12-09T03:13:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:13:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.230962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.231006 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.231061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.231086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.231104 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.233384 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f76aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.246686 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting 
controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd9680
21a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.258615 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.275273 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.291912 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.309739 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.326688 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.334035 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.334081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.334093 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.334114 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.334128 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.347594 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.366329 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.392697 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aea14747c97a9fb23f1d6782500f1b0fa793b957b2d8c0a76ac7b3d09ae931ab\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:12:49Z\\\",\\\"message\\\":\\\"d to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:12:49Z is after 2025-08-24T17:21:41Z]\\\\nI1209 03:12:49.722948 6401 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722976 6401 
services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI1209 03:12:49.722938 6401 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"4607c9b7-15f9-4ba0-86e5-0021ba7e4488\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:14Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 03:13:13.826095 6778 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:13:13.826125 6778 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:13:13.826165 6778 handler.go:208] Removed *v1.Namespace event handler 
1\\\\nI1209 03:13:13.826201 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:13:13.826171 6778 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:13:13.826298 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:13:13.826324 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:13:13.826357 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 03:13:13.826381 6778 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:13:13.826398 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 03:13:13.826407 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 03:13:13.826448 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 03:13:13.826491 6778 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:13:13.826549 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:13:13.826554 6778 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:13:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.405281 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.417942 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:14Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:14 crc 
kubenswrapper[4766]: I1209 03:13:14.437309 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.437383 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.437478 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.437555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.437586 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.540810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.540898 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.540932 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.540959 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.540979 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.645121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.645196 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.645296 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.645334 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.645361 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.748174 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.748256 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.748277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.748298 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.748313 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.839002 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:14 crc kubenswrapper[4766]: E1209 03:13:14.839192 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.839562 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.839605 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:14 crc kubenswrapper[4766]: E1209 03:13:14.839652 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:14 crc kubenswrapper[4766]: E1209 03:13:14.839866 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.851530 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.851576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.851589 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.851612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.851630 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.955125 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.955246 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.955275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.955346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:14 crc kubenswrapper[4766]: I1209 03:13:14.955390 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:14Z","lastTransitionTime":"2025-12-09T03:13:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.059341 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.059473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.059494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.059516 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.059533 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.133627 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/3.log" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.138106 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:13:15 crc kubenswrapper[4766]: E1209 03:13:15.138496 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.155922 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.163574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.163637 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.163652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc 
kubenswrapper[4766]: I1209 03:13:15.163678 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.163697 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.175912 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 
03:13:15.192046 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.205755 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc 
kubenswrapper[4766]: I1209 03:13:15.221444 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.246086 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.272111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.272193 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.272262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.272299 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.272326 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.284766 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:14Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 03:13:13.826095 6778 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:13:13.826125 6778 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:13:13.826165 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 
03:13:13.826201 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:13:13.826171 6778 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:13:13.826298 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:13:13.826324 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:13:13.826357 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 03:13:13.826381 6778 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:13:13.826398 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 03:13:13.826407 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 03:13:13.826448 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 03:13:13.826491 6778 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:13:13.826549 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:13:13.826554 6778 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:13:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.306049 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.323237 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:10Z\\\",\\\"message\\\":\\\"2025-12-09T03:12:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395\\\\n2025-12-09T03:12:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395 to /host/opt/cni/bin/\\\\n2025-12-09T03:12:25Z [verbose] multus-daemon started\\\\n2025-12-09T03:12:25Z [verbose] Readiness Indicator file check\\\\n2025-12-09T03:13:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:13:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.337108 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f7
6aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.357112 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.374753 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.376511 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.376558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.376575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.376599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.376618 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.389186 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.403144 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5
ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.423605 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\"
,\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.440916 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.456697 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:15Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.479565 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.479621 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.479634 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 
03:13:15.479656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.479670 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.582743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.582817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.582854 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.582881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.582896 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.686386 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.686462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.686482 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.686513 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.686537 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.789586 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.789692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.789718 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.789756 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.789783 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.838575 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:15 crc kubenswrapper[4766]: E1209 03:13:15.838832 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.894195 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.894295 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.894315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.894346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.894365 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.997831 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.997896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.997906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.997925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:15 crc kubenswrapper[4766]: I1209 03:13:15.997937 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:15Z","lastTransitionTime":"2025-12-09T03:13:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.101734 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.101808 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.101828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.101856 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.101878 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.204999 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.205680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.205743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.205807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.205864 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.308171 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.308473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.308541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.308605 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.308661 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.412038 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.412085 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.412097 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.412114 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.412125 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.514971 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.515029 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.515046 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.515064 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.515075 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.618969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.619047 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.619068 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.619098 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.619120 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.722656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.722738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.722776 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.722809 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.722831 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.826133 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.826239 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.826267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.826302 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.826328 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.838496 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.838593 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.838651 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:16 crc kubenswrapper[4766]: E1209 03:13:16.838858 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:16 crc kubenswrapper[4766]: E1209 03:13:16.839032 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:16 crc kubenswrapper[4766]: E1209 03:13:16.839322 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.929722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.929784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.929797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.929820 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:16 crc kubenswrapper[4766]: I1209 03:13:16.929834 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:16Z","lastTransitionTime":"2025-12-09T03:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.032750 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.032844 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.032869 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.032906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.032931 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.139804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.139884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.139921 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.139951 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.139978 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.243611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.243698 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.243717 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.243743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.243763 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.347488 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.347564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.347582 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.347608 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.347628 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.451434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.451522 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.451542 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.451572 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.451592 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.555135 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.555190 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.555242 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.555270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.555290 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.658928 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.659027 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.659059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.659096 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.659121 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.761938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.762020 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.762046 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.762081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.762103 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.838786 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:17 crc kubenswrapper[4766]: E1209 03:13:17.838987 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.865067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.865129 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.865145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.865170 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.865186 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.968586 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.968827 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.968843 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.968873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:17 crc kubenswrapper[4766]: I1209 03:13:17.968893 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:17Z","lastTransitionTime":"2025-12-09T03:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.074579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.075145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.075169 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.075197 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.075290 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.178761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.178832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.178853 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.178881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.178903 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.283140 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.283206 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.283270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.283310 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.283330 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.430006 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.430057 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.430069 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.430096 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.430111 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.532655 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.532707 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.532716 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.532738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.532752 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.636576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.636890 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.636911 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.636977 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.637001 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.740588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.740645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.740658 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.740684 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.740700 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.839063 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.839316 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.839368 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:18 crc kubenswrapper[4766]: E1209 03:13:18.839431 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:18 crc kubenswrapper[4766]: E1209 03:13:18.839611 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:18 crc kubenswrapper[4766]: E1209 03:13:18.839752 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.844306 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.844354 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.844366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.844382 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.844395 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.859642 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.875310 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-09T03:13:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.890447 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.907592 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.927825 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.947811 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.947867 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.947882 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.947903 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.947919 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:18Z","lastTransitionTime":"2025-12-09T03:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.965240 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:14Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 03:13:13.826095 6778 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:13:13.826125 6778 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:13:13.826165 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 
03:13:13.826201 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:13:13.826171 6778 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:13:13.826298 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:13:13.826324 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:13:13.826357 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 03:13:13.826381 6778 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:13:13.826398 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 03:13:13.826407 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 03:13:13.826448 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 03:13:13.826491 6778 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:13:13.826549 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:13:13.826554 6778 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:13:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.981162 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:18 crc kubenswrapper[4766]: I1209 03:13:18.995046 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:18Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc 
kubenswrapper[4766]: I1209 03:13:19.011570 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.025587 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.042254 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.051960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.052156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.052191 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.052255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.052286 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.058580 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.076645 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:10Z\\\",\\\"message\\\":\\\"2025-12-09T03:12:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395\\\\n2025-12-09T03:12:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395 to /host/opt/cni/bin/\\\\n2025-12-09T03:12:25Z [verbose] multus-daemon started\\\\n2025-12-09T03:12:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T03:13:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:13:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.091781 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f7
6aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.109259 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\"
,\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.122491 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.135946 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:19Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.155086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.155138 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.155156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 
03:13:19.155184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.155204 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.258967 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.259575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.259595 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.259626 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.259648 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.363168 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.363285 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.363314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.363348 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.363376 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.466405 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.466496 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.466523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.466558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.466683 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.571558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.571643 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.571662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.571687 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.571704 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.675239 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.675366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.675388 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.675413 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.675433 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.779262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.779309 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.779319 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.779346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.779356 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.838612 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:19 crc kubenswrapper[4766]: E1209 03:13:19.838898 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.882724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.882792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.882809 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.882833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.882851 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.985854 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.985923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.985943 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.985970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:19 crc kubenswrapper[4766]: I1209 03:13:19.985997 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:19Z","lastTransitionTime":"2025-12-09T03:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.089404 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.089451 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.089462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.089480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.089492 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.198685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.198835 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.198880 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.198903 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.198923 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.302686 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.302760 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.302773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.302794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.302806 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.406923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.407336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.407352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.407375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.407387 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.511702 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.512086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.512366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.512551 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.512715 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.616611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.616711 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.616730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.616762 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.616778 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.666154 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.666431 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.666595 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.666628 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:24.666593457 +0000 UTC m=+146.375898963 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.666698 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:24.66668095 +0000 UTC m=+146.375986576 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.720035 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.720095 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.720109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.720133 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.720145 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.767410 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.767508 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.767546 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.767650 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 
03:13:20.767756 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.767789 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.767806 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.767762 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:24.767736455 +0000 UTC m=+146.477041881 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.767867 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.767947 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.767975 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.767893 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:24.767866579 +0000 UTC m=+146.477172015 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.768080 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:24.768046484 +0000 UTC m=+146.477351940 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.822647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.823967 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.824035 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.824070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.824094 4766 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.838581 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.838700 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.838818 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.838624 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.839001 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:20 crc kubenswrapper[4766]: E1209 03:13:20.839123 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.927751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.928316 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.928347 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.928376 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:20 crc kubenswrapper[4766]: I1209 03:13:20.928396 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:20Z","lastTransitionTime":"2025-12-09T03:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.032004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.032062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.032075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.032097 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.032111 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.135716 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.136035 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.136155 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.136244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.136336 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.241469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.241550 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.242620 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.242673 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.242696 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.347588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.347639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.347651 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.347675 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.347689 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.451680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.451774 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.451803 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.451840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.451864 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.556009 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.556096 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.556126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.556168 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.556193 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.659701 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.659960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.659979 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.660009 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.660030 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.763094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.763173 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.763196 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.763264 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.763289 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.839194 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:21 crc kubenswrapper[4766]: E1209 03:13:21.839433 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.865750 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.865823 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.865844 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.865872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.865893 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.969338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.969391 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.969408 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.969432 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:21 crc kubenswrapper[4766]: I1209 03:13:21.969447 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:21Z","lastTransitionTime":"2025-12-09T03:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.073513 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.073591 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.073611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.073642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.073661 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.132000 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.132046 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.132058 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.132076 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.132089 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.152510 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:22Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.159172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.159275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.159295 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.159361 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.159382 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.182139 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:22Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.189780 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.190102 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.190309 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.190448 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.190586 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.214904 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:22Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.220321 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.220454 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.220563 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.220686 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.220788 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.242814 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:22Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.248757 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.248982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.249151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.249340 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.249522 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.272127 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:22Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.272686 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.275137 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.275256 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.275279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.275310 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.275329 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.378532 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.378575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.378587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.378604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.378614 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.482198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.482259 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.482272 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.482291 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.482303 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.585095 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.585144 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.585156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.585174 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.585186 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.688374 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.688766 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.688894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.689033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.689169 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.791868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.792157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.792250 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.792388 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.792444 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.838634 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.838721 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.838641 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.838886 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.839023 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:22 crc kubenswrapper[4766]: E1209 03:13:22.839118 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.896018 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.896096 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.896116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.896145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.896169 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.999904 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.999960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:22 crc kubenswrapper[4766]: I1209 03:13:22.999981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.000008 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.000028 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:22Z","lastTransitionTime":"2025-12-09T03:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.103647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.103740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.103765 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.103803 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.103829 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.208326 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.208408 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.208435 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.208479 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.208515 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.313414 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.313500 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.313518 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.313554 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.313584 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.417536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.417600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.417614 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.417640 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.417655 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.520682 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.520772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.520792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.520825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.520849 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.624446 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.624508 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.624521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.624543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.624587 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.728646 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.728724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.728744 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.728772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.728793 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.831919 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.832124 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.832141 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.832169 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.832185 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.838113 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:23 crc kubenswrapper[4766]: E1209 03:13:23.838339 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.938139 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.938200 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.938254 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.938281 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:23 crc kubenswrapper[4766]: I1209 03:13:23.938301 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:23Z","lastTransitionTime":"2025-12-09T03:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.042286 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.042345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.042357 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.042374 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.042385 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.145896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.146065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.146099 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.146133 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.146162 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.250190 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.250268 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.250280 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.250305 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.250319 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.354383 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.354439 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.354453 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.354470 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.354487 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.457502 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.457577 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.457591 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.457616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.457630 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.560757 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.560852 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.560872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.560906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.560932 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.665686 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.665766 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.665786 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.665817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.665838 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.769365 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.769432 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.769450 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.769473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.769493 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.838497 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:24 crc kubenswrapper[4766]: E1209 03:13:24.838751 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.839195 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:24 crc kubenswrapper[4766]: E1209 03:13:24.839394 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.839753 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:24 crc kubenswrapper[4766]: E1209 03:13:24.839973 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.873038 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.873099 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.873121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.873152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.873175 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.977543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.977631 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.977656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.977691 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:24 crc kubenswrapper[4766]: I1209 03:13:24.977716 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:24Z","lastTransitionTime":"2025-12-09T03:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.082507 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.082562 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.082575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.082597 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.082610 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.185731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.186122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.186330 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.186491 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.186619 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.290197 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.290777 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.290970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.291489 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.291958 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.394671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.395018 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.395087 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.395151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.395239 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.499449 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.499516 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.499536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.499570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.499595 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.602726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.603539 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.603641 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.603761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.603870 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.707420 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.707784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.707888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.708151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.708247 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.811803 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.811881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.811895 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.811913 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.811925 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.839026 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:25 crc kubenswrapper[4766]: E1209 03:13:25.839178 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.841056 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:13:25 crc kubenswrapper[4766]: E1209 03:13:25.841403 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.915205 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.915299 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.915314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.915337 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:25 crc kubenswrapper[4766]: I1209 03:13:25.915355 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:25Z","lastTransitionTime":"2025-12-09T03:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.019262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.019343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.019362 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.019394 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.019415 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.122838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.122925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.122941 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.122964 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.122977 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.225932 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.225992 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.226004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.226021 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.226036 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.328267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.328555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.328619 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.328679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.328733 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.433260 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.433651 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.433740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.433813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.433877 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.536755 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.537136 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.537241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.537334 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.537415 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.641671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.641747 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.641767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.641799 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.641818 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.744699 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.745138 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.745299 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.745619 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.745768 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.839258 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 09 03:13:26 crc kubenswrapper[4766]: E1209 03:13:26.839508 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.839544 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.839628 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 09 03:13:26 crc kubenswrapper[4766]: E1209 03:13:26.839641 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 09 03:13:26 crc kubenswrapper[4766]: E1209 03:13:26.839800 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.848714 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.848790 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.848810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.848831 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.848883 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.951619 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.951730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.951750 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.951778 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:26 crc kubenswrapper[4766]: I1209 03:13:26.951797 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:26Z","lastTransitionTime":"2025-12-09T03:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.055294 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.055378 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.055406 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.055460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.055493 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.158794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.158863 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.158882 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.158909 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.158928 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.262192 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.262609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.262892 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.263091 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.263281 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.367852 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.367934 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.367952 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.367983 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.368012 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.471540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.471615 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.471635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.471667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.471686 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.575515 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.575593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.575612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.575644 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.575667 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.679011 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.680401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.680453 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.680495 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.680518 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.783860 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.783905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.783917 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.783935 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.783946 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.839001 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth"
Dec 09 03:13:27 crc kubenswrapper[4766]: E1209 03:13:27.839539 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.886584 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.886648 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.886663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.886689 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.886708 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.991711 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.991767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.991795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.991820 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:27 crc kubenswrapper[4766]: I1209 03:13:27.991836 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:27Z","lastTransitionTime":"2025-12-09T03:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.095317 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.095501 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.095534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.095574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.095604 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.198537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.198915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.198988 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.199085 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.199173 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.302305 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.302859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.303068 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.303270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.303518 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.407569 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.407617 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.407635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.407658 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.407673 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.510883 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.511345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.511461 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.511585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.511707 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.615121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.615611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.615792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.615990 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.616179 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.719927 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.719991 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.720007 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.720032 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.720053 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.823894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.823953 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.823968 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.823989 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.824004 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.838980 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 09 03:13:28 crc kubenswrapper[4766]: E1209 03:13:28.839087 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.839158 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.839159 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 09 03:13:28 crc kubenswrapper[4766]: E1209 03:13:28.839305 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 09 03:13:28 crc kubenswrapper[4766]: E1209 03:13:28.839578 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.854497 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.862347 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:28Z is after 2025-08-24T17:21:41Z"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.922005 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:28Z is after 2025-08-24T17:21:41Z"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.927268 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.927327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.927341 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.927373 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.927389 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:28Z","lastTransitionTime":"2025-12-09T03:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.949277 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.964113 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:28 crc 
kubenswrapper[4766]: I1209 03:13:28.979948 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:28 crc kubenswrapper[4766]: I1209 03:13:28.996742 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:28Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.017624 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:14Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 03:13:13.826095 6778 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:13:13.826125 6778 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:13:13.826165 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 
03:13:13.826201 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:13:13.826171 6778 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:13:13.826298 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:13:13.826324 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:13:13.826357 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 03:13:13.826381 6778 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:13:13.826398 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 03:13:13.826407 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 03:13:13.826448 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 03:13:13.826491 6778 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:13:13.826549 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:13:13.826554 6778 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:13:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.030394 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.030493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.030518 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.030550 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.030569 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.033842 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.049700 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:10Z\\\",\\\"message\\\":\\\"2025-12-09T03:12:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395\\\\n2025-12-09T03:12:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395 to /host/opt/cni/bin/\\\\n2025-12-09T03:12:25Z [verbose] multus-daemon started\\\\n2025-12-09T03:12:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T03:13:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:13:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.063065 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f7
6aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.078387 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.094415 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.106633 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.118357 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.133535 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.133608 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.133622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.133641 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.133681 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.134793 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] 
pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8
a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.149295 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.164556 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:29Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.236552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.236602 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.236616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 
03:13:29.236635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.236647 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.340057 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.340113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.340125 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.340145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.340159 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.442655 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.442712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.442723 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.442741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.442753 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.545689 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.545740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.545754 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.545771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.545782 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.649270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.649332 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.649346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.649367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.649382 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.752160 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.752230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.752243 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.752264 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.752276 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.838871 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:29 crc kubenswrapper[4766]: E1209 03:13:29.839595 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.855104 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.855160 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.855180 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.855203 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.855259 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.959486 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.959550 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.959571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.959599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:29 crc kubenswrapper[4766]: I1209 03:13:29.959618 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:29Z","lastTransitionTime":"2025-12-09T03:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.062227 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.062273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.062281 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.062296 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.062307 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.167640 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.167719 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.167738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.167771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.167794 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.271162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.271255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.271275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.271303 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.271324 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.374987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.375054 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.375070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.375097 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.375116 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.478114 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.478188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.478230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.478257 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.478274 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.581618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.581675 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.581686 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.581728 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.581742 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.684818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.684860 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.684873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.684893 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.684913 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.788361 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.788429 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.788445 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.788466 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.788482 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.839178 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.839283 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.839280 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:30 crc kubenswrapper[4766]: E1209 03:13:30.839496 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:30 crc kubenswrapper[4766]: E1209 03:13:30.839603 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:30 crc kubenswrapper[4766]: E1209 03:13:30.839687 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.892266 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.892309 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.892324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.892340 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.892352 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.996525 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.996593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.996605 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.996629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:30 crc kubenswrapper[4766]: I1209 03:13:30.996643 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:30Z","lastTransitionTime":"2025-12-09T03:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.100533 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.100622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.100645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.100675 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.100703 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.204361 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.204455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.204481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.204521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.204547 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.307922 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.308015 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.308036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.308072 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.308095 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.411791 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.411897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.411918 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.412574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.412618 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.515798 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.515844 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.515854 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.515872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.515883 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.618622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.618690 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.618707 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.618735 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.618755 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.721813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.721894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.721908 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.721927 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.721938 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.824476 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.824529 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.824540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.824561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.824575 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.839337 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:31 crc kubenswrapper[4766]: E1209 03:13:31.840046 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.857072 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.927995 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.928041 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.928050 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.928069 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:31 crc kubenswrapper[4766]: I1209 03:13:31.928079 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:31Z","lastTransitionTime":"2025-12-09T03:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.030702 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.030769 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.030789 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.030817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.030839 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.134787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.134834 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.134845 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.134866 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.134878 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.238177 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.238306 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.238339 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.238379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.238422 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.342350 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.342450 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.342494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.342525 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.342547 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.421257 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.421346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.421370 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.421401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.421423 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.440750 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.447931 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.448020 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.448057 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.448128 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.448156 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.466296 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.472117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.472173 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.472193 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.472329 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.472370 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.493614 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.498817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.498868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.498886 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.498911 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.498931 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.514823 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.520523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.520597 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.520618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.520645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.520663 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.544920 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"45cd1af0-b2e3-4c88-9c4c-c0bd21b8294d\\\",\\\"systemUUID\\\":\\\"2701ecc7-fbfe-4321-9495-f77bb9b59c76\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:32Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.545335 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.548340 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.548422 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.548454 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.548496 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.548537 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.650957 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.651024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.651046 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.651073 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.651093 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.754434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.754564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.754598 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.754625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.754644 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.839069 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.839121 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.839386 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.840572 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.840698 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:32 crc kubenswrapper[4766]: E1209 03:13:32.840875 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.857407 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.857466 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.857484 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.857509 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.857529 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.961083 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.961151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.961174 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.961205 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:32 crc kubenswrapper[4766]: I1209 03:13:32.961253 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:32Z","lastTransitionTime":"2025-12-09T03:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.064741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.064852 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.064875 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.064906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.064926 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.169266 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.169357 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.169375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.169404 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.169428 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.273162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.273311 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.273345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.273389 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.273415 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.377564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.377639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.377664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.377697 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.377718 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.481549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.481630 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.481685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.481715 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.481734 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.585356 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.585724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.585818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.585911 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.585991 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.689833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.689903 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.689926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.689955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.689976 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.793391 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.793468 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.793481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.793509 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.793526 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.838864 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:33 crc kubenswrapper[4766]: E1209 03:13:33.839052 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.897449 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.897979 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.898122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.898537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:33 crc kubenswrapper[4766]: I1209 03:13:33.898700 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:33Z","lastTransitionTime":"2025-12-09T03:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.001808 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.001878 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.001900 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.001928 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.001948 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.106123 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.106168 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.106180 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.106197 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.106207 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.209823 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.210260 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.210534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.210705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.210851 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.314628 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.314715 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.314739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.314771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.314796 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.418184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.418300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.418322 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.418353 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.418379 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.523471 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.523549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.523567 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.523595 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.523618 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.627432 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.627514 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.627540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.627574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.627597 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.729967 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.730016 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.730027 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.730042 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.730052 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.832737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.832787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.832797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.832813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.832825 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.839106 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.839140 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.839180 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:34 crc kubenswrapper[4766]: E1209 03:13:34.839302 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:34 crc kubenswrapper[4766]: E1209 03:13:34.839371 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:34 crc kubenswrapper[4766]: E1209 03:13:34.839426 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.935035 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.935102 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.935114 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.935132 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:34 crc kubenswrapper[4766]: I1209 03:13:34.935149 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:34Z","lastTransitionTime":"2025-12-09T03:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.037739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.037795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.037810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.037833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.037850 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.141327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.141501 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.141524 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.141553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.142055 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.245597 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.246041 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.246293 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.246630 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.246750 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.350178 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.350699 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.350853 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.351004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.351128 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.454955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.455024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.455041 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.455069 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.455089 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.558642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.558699 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.558716 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.558741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.558761 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.662466 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.662522 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.662531 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.662548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.662560 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.766319 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.766421 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.766447 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.766485 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.766509 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.839059 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:35 crc kubenswrapper[4766]: E1209 03:13:35.841056 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.869426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.869496 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.869517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.869539 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.869556 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.972648 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.972710 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.972725 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.972745 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:35 crc kubenswrapper[4766]: I1209 03:13:35.972758 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:35Z","lastTransitionTime":"2025-12-09T03:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.075389 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.075450 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.075460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.075474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.075484 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.178734 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.178828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.178850 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.178883 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.178906 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.282426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.282500 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.282519 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.282547 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.282568 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.386497 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.386593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.386620 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.386650 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.386671 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.491203 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.491334 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.491356 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.491384 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.491405 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.594437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.594484 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.594495 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.594517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.594531 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.698056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.698113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.698130 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.698156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.698176 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.800795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.800847 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.800864 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.800882 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.800895 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.839127 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.839127 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:36 crc kubenswrapper[4766]: E1209 03:13:36.839357 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.839287 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:36 crc kubenswrapper[4766]: E1209 03:13:36.839573 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:36 crc kubenswrapper[4766]: E1209 03:13:36.839727 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.903862 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.903910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.903920 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.903939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:36 crc kubenswrapper[4766]: I1209 03:13:36.903954 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:36Z","lastTransitionTime":"2025-12-09T03:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.007185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.007352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.007380 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.007423 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.007452 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.110357 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.110434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.110452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.110480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.110497 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.214419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.214494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.214513 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.214543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.214565 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.318055 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.318109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.318122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.318140 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.318150 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.422192 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.422344 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.422364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.422391 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.422408 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.526154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.526257 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.526284 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.526315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.526338 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.628939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.628993 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.629005 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.629026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.629039 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.732460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.732548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.732567 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.732597 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.732616 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.836343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.837042 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.837245 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.837407 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.837565 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.838747 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:37 crc kubenswrapper[4766]: E1209 03:13:37.838979 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.942002 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.942069 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.942088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.942116 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:37 crc kubenswrapper[4766]: I1209 03:13:37.942137 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:37Z","lastTransitionTime":"2025-12-09T03:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.045551 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.045623 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.045643 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.045678 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.045705 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.149089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.149156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.149178 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.149207 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.149268 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.252147 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.252258 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.252279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.252306 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.252327 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.356015 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.356642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.356868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.357090 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.357319 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.461448 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.461896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.461980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.462068 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.462161 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.565728 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.566122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.566196 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.566318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.566399 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.670536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.670604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.670627 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.670659 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.670683 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.774129 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.774206 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.774267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.774299 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.774322 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.839289 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:38 crc kubenswrapper[4766]: E1209 03:13:38.839594 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.840135 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:38 crc kubenswrapper[4766]: E1209 03:13:38.841943 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.841771 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:13:38 crc kubenswrapper[4766]: E1209 03:13:38.842781 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.840442 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:38 crc kubenswrapper[4766]: E1209 03:13:38.843369 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.864634 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"GCM_SHA384' detected.\\\\nW1209 03:12:16.850486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 03:12:16.850507 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 03:12:16.853267 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853327 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1209 03:12:16.853371 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853404 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1209 03:12:16.853444 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1209 03:12:16.853508 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1209 03:12:16.853455 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1209 03:12:16.853551 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1209 03:12:16.853568 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\"\\\\nF1209 03:12:16.854352 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1209 03:12:16.853658 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2002291129/tls.crt::/tmp/serving-cert-2002291129/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765249920\\\\\\\\\\\\\\\" (2025-12-09 03:12:00 +0000 UTC to 2026-01-08 03:12:01 +0000 UTC (now=2025-12-09 03:12:16.853620488 +0000 UTC))\\\\\\\"\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338
ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.879271 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.879462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.879495 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.879832 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.879888 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.885038 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfcf95a6-7b63-4960-a237-8f129e815543\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0eb22dc854a71bea489149e0f1e58f081cff01df16293c8fe1d598637df38fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771a
ee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://718f69b9d9ed201e95c582efb87076801b7c41ce8aece300ff857246f3db8e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26ff5598bfad340172fc92296b957e1f3d1d7d353b8e55b1e4a5a924aef11842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47395b26469ff8193be923e11cb4dbdad8277c83eb4b27076dd37c9615bb2ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.907005 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.933910 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6f6bf8f-2ba7-4ce6-add4-72ea81f1a6cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe076c13be85d36b4a984a037b1f6958db9073dcb94ba38908145259d43ac38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59a1408e62adecfe4bfd6ab1a09ba2213f110fcd93fbf54ef7042febba52ccfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59a1408e62adecfe4bfd6ab1a09ba2213f110fcd93fbf54ef7042febba52ccfd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.957701 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f74377b-781a-48e2-8502-b4cb309ca21c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c3ec085d869d34e526512884b5dad62cc1fe8eeab2b20e5e7394f55666dcc2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6a5ddd7c86e52a226dc7b865d48b03fa9beac96c7be800d0f64a1c109524922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78f00948adfba79a2dfa1dc0463471b712100d1f5ec7f472abb97ab790bd8d99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b7e06a5ab1868b3d41878b8937875144566718e10806233e98b9754154d541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58a36af88a1c9fafe479878b3878809abec590395a8eb69feaa9df1d7ff8534\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://731d57a43e5481755f5aade6fd96b713977c020c9adf77bdf8e690c6e3414d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://731d57a43e5481755f5aade6fd96b713977c020c9adf77bdf8e690c6e3414d37\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-09T03:11:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69b2347a0ec67392bdcf4654ce0514d17519e3edd44b53fe24bc2cf49372de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d69b2347a0ec67392bdcf4654ce0514d17519e3edd44b53fe24bc2cf49372de7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e49edd84f63ae4c39e39494cadfe96efd8db014c741eb998908fb3cc1da1cbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e49edd84f63ae4c39e39494cadfe96efd8db014c741eb998908fb3cc1da1cbea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.981133 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8d2c6a2bb4d56026a11b09106f40c70be0fdd101bb23efa1a2a5b47b7324bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e455ef9dabb5203c06e4675ab25a4b8f3d8ba47bcd1aaf5577ba69f612b969be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.983976 4766 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.984033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.984052 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.984077 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.984095 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:38Z","lastTransitionTime":"2025-12-09T03:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:38 crc kubenswrapper[4766]: I1209 03:13:38.998527 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c1ab33f6d915f0c6dd9d744c2c10c651160dc4e284ca33b9cc7a1d6bf543bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:38Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.019261 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.037145 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16a9cd9d-6151-4f45-9419-cd35413553c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:11:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7267fc821a22369df2c9d7987b13536d1e1b8631a0c39c35ce4514e510a306b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:11:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abef51cabad2c6c92f5085e1028889f65c61361006d6bb205808bd8db41c3345\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d0d0d07fc42c84e9f57f8730e204df9e896b5d2fd4db583843da2bb262cd06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-09T03:12:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:11:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.064612 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99b9b55d-a081-4c84-8535-58468c316659\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abc66814e10785e2f678ee88dcb9d5366233bdbeccb40eb0a229499862ee6695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b1d4a46fa2439627cd8a9f999bdbc93d914b93641e6b574a0bf509619ab433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c569297b4b7569c18330373712b61b932005bf49f0223003b0c4f86f86babde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36268ce330e33ba0f2a2ae026b96a673ce30b0b765615b14857874c85275b77b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b51c
2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b51c2de2aad97e6ea56e5b9d37dc6e888968f6a7815d8a2d50fd25508d1a2a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://061344ba63a47ffa2f23024041c8e6823c38fe125ea6f16abfe4a5a5bcd7a5dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a17d195ca0fdb851728beea795ea55d5e6c28a48bfff81fe8789428c71257484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9rzz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xm6zk\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.088621 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.088673 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.088682 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.088699 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.088709 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.092587 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:14Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 03:13:13.826095 6778 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1209 03:13:13.826125 6778 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1209 03:13:13.826165 6778 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1209 
03:13:13.826201 6778 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1209 03:13:13.826171 6778 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 03:13:13.826298 6778 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1209 03:13:13.826324 6778 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1209 03:13:13.826357 6778 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1209 03:13:13.826381 6778 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 03:13:13.826398 6778 handler.go:208] Removed *v1.Node event handler 2\\\\nI1209 03:13:13.826407 6778 handler.go:208] Removed *v1.Node event handler 7\\\\nI1209 03:13:13.826448 6778 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1209 03:13:13.826491 6778 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1209 03:13:13.826549 6778 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 03:13:13.826554 6778 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:13:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aa1703c166d62682c
1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T03:12:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8tw62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-62t52\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.109762 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l7fcf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8f5525-c018-45f6-8f73-355ac763742e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://774b271073c57fdc1bb8d00ef9dffbf0e7db4507d2c369de182658262b7d9ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jf7jm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l7fcf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.126702 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6qth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6qth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc 
kubenswrapper[4766]: I1209 03:13:39.150998 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcb116608b582ebef8d1e46ebc774e0d3f977fe17134f2f71e94b79b756e32e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.170065 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.182318 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7rlr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfda2870-98c2-41d7-82f4-45e9b5b18460\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://549db356e5a84d689a6abf67c234b143a249dc0f384e90c1a6f42f47df779502\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9s2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7rlr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.192019 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.192080 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.192099 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.192133 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.192155 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.200746 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a42b369b-e4ad-447c-b9b1-5c2461116838\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2877792840b2161d9c50cd2f4cdf9e6c606f4a20e8560570abb8b3e8f5295d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2rzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-db9hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.224312 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gx9l2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c83a9d31-9c87-4a13-ab9a-2992e852eb47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:13:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T03:13:10Z\\\",\\\"message\\\":\\\"2025-12-09T03:12:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395\\\\n2025-12-09T03:12:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_973549eb-58b4-45b6-bdfe-c4d395cac395 to /host/opt/cni/bin/\\\\n2025-12-09T03:12:25Z [verbose] multus-daemon started\\\\n2025-12-09T03:12:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-09T03:13:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T03:12:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:13:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfzcd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gx9l2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.244061 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3520fc8f-8421-4ce0-b98a-f08f96ce2f2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T03:12:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bfda007f0fad974924effcc2a8d974299c3f458df7dc58541b73ee388d86b55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd82237847bae8c02b946821a4dbf6fb09f7
6aaa32efcf8d4c504a1c64b8665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T03:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w59ps\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T03:12:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nc6gc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T03:13:39Z is after 2025-08-24T17:21:41Z" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.296601 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.296727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.296748 4766 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.296787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.296827 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.399462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.399534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.399560 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.399596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.399635 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.503279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.503348 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.503397 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.503427 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.503446 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.606944 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.607017 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.607034 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.607065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.607088 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.710786 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.710864 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.710886 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.710926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.710967 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.814094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.814596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.816073 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.816328 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.816559 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.839075 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:39 crc kubenswrapper[4766]: E1209 03:13:39.839567 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.920439 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.920524 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.920543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.920573 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:39 crc kubenswrapper[4766]: I1209 03:13:39.920593 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:39Z","lastTransitionTime":"2025-12-09T03:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.024355 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.024417 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.024432 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.024455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.024470 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.128419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.128514 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.128533 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.128553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.128567 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.231772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.231826 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.231842 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.231864 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.231879 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.334939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.335042 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.335121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.335150 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.335167 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.439482 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.439586 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.439613 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.439650 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.439675 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.543737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.543787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.543799 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.543816 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.543829 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.646717 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.646763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.646775 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.646794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.646807 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.750276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.750360 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.750384 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.750415 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.750441 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.839449 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.839589 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.839947 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:40 crc kubenswrapper[4766]: E1209 03:13:40.840018 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:40 crc kubenswrapper[4766]: E1209 03:13:40.840145 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:40 crc kubenswrapper[4766]: E1209 03:13:40.840353 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.853726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.854168 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.854398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.854504 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.854593 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.957388 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.957442 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.957452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.957469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:40 crc kubenswrapper[4766]: I1209 03:13:40.957480 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:40Z","lastTransitionTime":"2025-12-09T03:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.061126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.061181 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.061192 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.061245 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.061261 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.164936 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.165517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.165577 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.165621 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.165660 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.269444 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.269500 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.269517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.269541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.269557 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.372790 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.372884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.372908 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.372947 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.372969 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.478003 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.478074 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.478092 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.478119 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.478138 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.583119 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.583201 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.583251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.583279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.583303 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.687636 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.687712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.687726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.687754 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.687769 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.790960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.791022 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.791036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.791059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.791073 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.838029 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:41 crc kubenswrapper[4766]: E1209 03:13:41.838180 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.894315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.894379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.894398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.894419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.894433 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.998430 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.998509 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.998524 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.998548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:41 crc kubenswrapper[4766]: I1209 03:13:41.998564 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:41Z","lastTransitionTime":"2025-12-09T03:13:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.049373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:42 crc kubenswrapper[4766]: E1209 03:13:42.049704 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:13:42 crc kubenswrapper[4766]: E1209 03:13:42.050166 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs podName:5c9eb693-99eb-4b02-b33a-26d506eeb3f1 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:46.050127969 +0000 UTC m=+167.759433435 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs") pod "network-metrics-daemon-z6qth" (UID: "5c9eb693-99eb-4b02-b33a-26d506eeb3f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.101304 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.102433 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.102479 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.102510 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.102521 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:42Z","lastTransitionTime":"2025-12-09T03:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.205933 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.205998 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.206016 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.206041 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.206059 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:42Z","lastTransitionTime":"2025-12-09T03:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.310322 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.310851 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.311021 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.311276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.311491 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:42Z","lastTransitionTime":"2025-12-09T03:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.415127 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.415195 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.415246 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.415279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.415301 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:42Z","lastTransitionTime":"2025-12-09T03:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.518538 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.518594 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.518606 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.518625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.518637 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:42Z","lastTransitionTime":"2025-12-09T03:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.622440 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.622837 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.623046 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.623395 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.623644 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:42Z","lastTransitionTime":"2025-12-09T03:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.728033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.728094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.728104 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.728122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.728134 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:42Z","lastTransitionTime":"2025-12-09T03:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.748397 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.748469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.748492 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.748520 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.748539 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T03:13:42Z","lastTransitionTime":"2025-12-09T03:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.811094 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx"] Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.811639 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.814023 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.814838 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.815334 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.815488 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.839407 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:42 crc kubenswrapper[4766]: E1209 03:13:42.839872 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.840099 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:42 crc kubenswrapper[4766]: E1209 03:13:42.840325 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.840184 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:42 crc kubenswrapper[4766]: E1209 03:13:42.840627 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.861046 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef10d1ca-da06-4a53-bbb1-7040a6237b28-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.861140 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef10d1ca-da06-4a53-bbb1-7040a6237b28-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.861263 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef10d1ca-da06-4a53-bbb1-7040a6237b28-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.861317 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef10d1ca-da06-4a53-bbb1-7040a6237b28-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.861388 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef10d1ca-da06-4a53-bbb1-7040a6237b28-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.881820 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7rlr6" podStartSLOduration=79.881767284 podStartE2EDuration="1m19.881767284s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:42.860867228 +0000 UTC m=+104.570172674" watchObservedRunningTime="2025-12-09 03:13:42.881767284 +0000 UTC m=+104.591072710" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.901735 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podStartSLOduration=79.901703224 podStartE2EDuration="1m19.901703224s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:42.882094823 +0000 UTC m=+104.591400279" watchObservedRunningTime="2025-12-09 03:13:42.901703224 +0000 UTC m=+104.611008690" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.901924 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gx9l2" podStartSLOduration=79.901916939 podStartE2EDuration="1m19.901916939s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:42.901641982 +0000 UTC 
m=+104.610947418" watchObservedRunningTime="2025-12-09 03:13:42.901916939 +0000 UTC m=+104.611222405" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.937677 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nc6gc" podStartSLOduration=79.937650957 podStartE2EDuration="1m19.937650957s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:42.918104878 +0000 UTC m=+104.627410384" watchObservedRunningTime="2025-12-09 03:13:42.937650957 +0000 UTC m=+104.646956373" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.961901 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef10d1ca-da06-4a53-bbb1-7040a6237b28-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.962232 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef10d1ca-da06-4a53-bbb1-7040a6237b28-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.962336 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef10d1ca-da06-4a53-bbb1-7040a6237b28-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 
03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.962049 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ef10d1ca-da06-4a53-bbb1-7040a6237b28-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.962464 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef10d1ca-da06-4a53-bbb1-7040a6237b28-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.962621 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef10d1ca-da06-4a53-bbb1-7040a6237b28-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.962655 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ef10d1ca-da06-4a53-bbb1-7040a6237b28-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.963959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef10d1ca-da06-4a53-bbb1-7040a6237b28-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.972805 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=49.972773687 podStartE2EDuration="49.972773687s" podCreationTimestamp="2025-12-09 03:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:42.954689448 +0000 UTC m=+104.663994874" watchObservedRunningTime="2025-12-09 03:13:42.972773687 +0000 UTC m=+104.682079153" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.982197 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef10d1ca-da06-4a53-bbb1-7040a6237b28-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:42 crc kubenswrapper[4766]: I1209 03:13:42.989177 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef10d1ca-da06-4a53-bbb1-7040a6237b28-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4vgsx\" (UID: \"ef10d1ca-da06-4a53-bbb1-7040a6237b28\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:43 crc kubenswrapper[4766]: I1209 03:13:43.022261 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.022233717 podStartE2EDuration="1m27.022233717s" podCreationTimestamp="2025-12-09 03:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-09 03:13:43.022127514 +0000 UTC m=+104.731432950" watchObservedRunningTime="2025-12-09 03:13:43.022233717 +0000 UTC m=+104.731539143" Dec 09 03:13:43 crc kubenswrapper[4766]: I1209 03:13:43.076638 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.076608778 podStartE2EDuration="12.076608778s" podCreationTimestamp="2025-12-09 03:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:43.058744005 +0000 UTC m=+104.768049441" watchObservedRunningTime="2025-12-09 03:13:43.076608778 +0000 UTC m=+104.785914204" Dec 09 03:13:43 crc kubenswrapper[4766]: I1209 03:13:43.119122 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=15.119100408 podStartE2EDuration="15.119100408s" podCreationTimestamp="2025-12-09 03:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:43.118956625 +0000 UTC m=+104.828262081" watchObservedRunningTime="2025-12-09 03:13:43.119100408 +0000 UTC m=+104.828405834" Dec 09 03:13:43 crc kubenswrapper[4766]: I1209 03:13:43.138140 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" Dec 09 03:13:43 crc kubenswrapper[4766]: W1209 03:13:43.159156 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef10d1ca_da06_4a53_bbb1_7040a6237b28.slice/crio-a0cf95e06321c1a68a9c2414613b6c654a86824b5ca8acd1d8846dc1a28c2813 WatchSource:0}: Error finding container a0cf95e06321c1a68a9c2414613b6c654a86824b5ca8acd1d8846dc1a28c2813: Status 404 returned error can't find the container with id a0cf95e06321c1a68a9c2414613b6c654a86824b5ca8acd1d8846dc1a28c2813 Dec 09 03:13:43 crc kubenswrapper[4766]: I1209 03:13:43.176080 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xm6zk" podStartSLOduration=80.176056931 podStartE2EDuration="1m20.176056931s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:43.141564127 +0000 UTC m=+104.850869563" watchObservedRunningTime="2025-12-09 03:13:43.176056931 +0000 UTC m=+104.885362357" Dec 09 03:13:43 crc kubenswrapper[4766]: I1209 03:13:43.193521 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l7fcf" podStartSLOduration=80.193498223 podStartE2EDuration="1m20.193498223s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:43.192680301 +0000 UTC m=+104.901985727" watchObservedRunningTime="2025-12-09 03:13:43.193498223 +0000 UTC m=+104.902803649" Dec 09 03:13:43 crc kubenswrapper[4766]: I1209 03:13:43.261248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" 
event={"ID":"ef10d1ca-da06-4a53-bbb1-7040a6237b28","Type":"ContainerStarted","Data":"a0cf95e06321c1a68a9c2414613b6c654a86824b5ca8acd1d8846dc1a28c2813"} Dec 09 03:13:43 crc kubenswrapper[4766]: I1209 03:13:43.838572 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:43 crc kubenswrapper[4766]: E1209 03:13:43.838767 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:44 crc kubenswrapper[4766]: I1209 03:13:44.266620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" event={"ID":"ef10d1ca-da06-4a53-bbb1-7040a6237b28","Type":"ContainerStarted","Data":"684f88b3a138cedcef886ad1df61b2873809081bbb6f1b22b29b22cd5fb274c5"} Dec 09 03:13:44 crc kubenswrapper[4766]: I1209 03:13:44.281455 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=86.281422346 podStartE2EDuration="1m26.281422346s" podCreationTimestamp="2025-12-09 03:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:43.222446436 +0000 UTC m=+104.931751862" watchObservedRunningTime="2025-12-09 03:13:44.281422346 +0000 UTC m=+105.990727792" Dec 09 03:13:44 crc kubenswrapper[4766]: I1209 03:13:44.281957 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4vgsx" podStartSLOduration=81.28195178 
podStartE2EDuration="1m21.28195178s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:13:44.281101407 +0000 UTC m=+105.990406853" watchObservedRunningTime="2025-12-09 03:13:44.28195178 +0000 UTC m=+105.991257206" Dec 09 03:13:44 crc kubenswrapper[4766]: I1209 03:13:44.838419 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:44 crc kubenswrapper[4766]: E1209 03:13:44.839085 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:44 crc kubenswrapper[4766]: I1209 03:13:44.839846 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:44 crc kubenswrapper[4766]: E1209 03:13:44.840126 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:44 crc kubenswrapper[4766]: I1209 03:13:44.840482 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:44 crc kubenswrapper[4766]: E1209 03:13:44.840808 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:45 crc kubenswrapper[4766]: I1209 03:13:45.838740 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:45 crc kubenswrapper[4766]: E1209 03:13:45.838940 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:46 crc kubenswrapper[4766]: I1209 03:13:46.838557 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:46 crc kubenswrapper[4766]: I1209 03:13:46.838526 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:46 crc kubenswrapper[4766]: E1209 03:13:46.839444 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:46 crc kubenswrapper[4766]: I1209 03:13:46.839760 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:46 crc kubenswrapper[4766]: E1209 03:13:46.840067 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:46 crc kubenswrapper[4766]: E1209 03:13:46.840394 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:47 crc kubenswrapper[4766]: I1209 03:13:47.839353 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:47 crc kubenswrapper[4766]: E1209 03:13:47.839539 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:48 crc kubenswrapper[4766]: I1209 03:13:48.838683 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:48 crc kubenswrapper[4766]: I1209 03:13:48.838725 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:48 crc kubenswrapper[4766]: E1209 03:13:48.840004 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:48 crc kubenswrapper[4766]: I1209 03:13:48.840109 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:48 crc kubenswrapper[4766]: E1209 03:13:48.840627 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:48 crc kubenswrapper[4766]: E1209 03:13:48.840324 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:49 crc kubenswrapper[4766]: I1209 03:13:49.838300 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:49 crc kubenswrapper[4766]: E1209 03:13:49.838534 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:50 crc kubenswrapper[4766]: I1209 03:13:50.839020 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:50 crc kubenswrapper[4766]: E1209 03:13:50.839171 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:50 crc kubenswrapper[4766]: I1209 03:13:50.839043 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:50 crc kubenswrapper[4766]: I1209 03:13:50.839237 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:50 crc kubenswrapper[4766]: E1209 03:13:50.839488 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:50 crc kubenswrapper[4766]: E1209 03:13:50.839583 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:51 crc kubenswrapper[4766]: I1209 03:13:51.839019 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:51 crc kubenswrapper[4766]: E1209 03:13:51.839185 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:51 crc kubenswrapper[4766]: I1209 03:13:51.840546 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:13:51 crc kubenswrapper[4766]: E1209 03:13:51.840947 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-62t52_openshift-ovn-kubernetes(d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec)\"" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" Dec 09 03:13:52 crc kubenswrapper[4766]: I1209 03:13:52.838356 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:52 crc kubenswrapper[4766]: E1209 03:13:52.838604 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:52 crc kubenswrapper[4766]: I1209 03:13:52.839070 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:52 crc kubenswrapper[4766]: E1209 03:13:52.839418 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:52 crc kubenswrapper[4766]: I1209 03:13:52.839085 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:52 crc kubenswrapper[4766]: E1209 03:13:52.840487 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:53 crc kubenswrapper[4766]: I1209 03:13:53.839182 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:53 crc kubenswrapper[4766]: E1209 03:13:53.839445 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:54 crc kubenswrapper[4766]: I1209 03:13:54.860536 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:54 crc kubenswrapper[4766]: E1209 03:13:54.860665 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:54 crc kubenswrapper[4766]: I1209 03:13:54.860660 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:54 crc kubenswrapper[4766]: I1209 03:13:54.860778 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:54 crc kubenswrapper[4766]: E1209 03:13:54.860960 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:54 crc kubenswrapper[4766]: E1209 03:13:54.861174 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:55 crc kubenswrapper[4766]: I1209 03:13:55.838730 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:55 crc kubenswrapper[4766]: E1209 03:13:55.838920 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:56 crc kubenswrapper[4766]: I1209 03:13:56.838571 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:56 crc kubenswrapper[4766]: I1209 03:13:56.838648 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:56 crc kubenswrapper[4766]: I1209 03:13:56.838806 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:56 crc kubenswrapper[4766]: E1209 03:13:56.838961 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:56 crc kubenswrapper[4766]: E1209 03:13:56.839483 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:56 crc kubenswrapper[4766]: E1209 03:13:56.839685 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:57 crc kubenswrapper[4766]: I1209 03:13:57.317208 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/1.log" Dec 09 03:13:57 crc kubenswrapper[4766]: I1209 03:13:57.318400 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/0.log" Dec 09 03:13:57 crc kubenswrapper[4766]: I1209 03:13:57.318467 4766 generic.go:334] "Generic (PLEG): container finished" podID="c83a9d31-9c87-4a13-ab9a-2992e852eb47" containerID="1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4" exitCode=1 Dec 09 03:13:57 crc kubenswrapper[4766]: I1209 03:13:57.318515 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gx9l2" event={"ID":"c83a9d31-9c87-4a13-ab9a-2992e852eb47","Type":"ContainerDied","Data":"1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4"} Dec 09 03:13:57 crc kubenswrapper[4766]: I1209 03:13:57.318572 4766 scope.go:117] "RemoveContainer" containerID="d4b6a4e9421efcac1c56e082b5d65ff45812165a4c5b2d259ebd6ce49d870ee0" Dec 09 03:13:57 crc kubenswrapper[4766]: I1209 03:13:57.319207 4766 scope.go:117] "RemoveContainer" containerID="1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4" Dec 09 03:13:57 crc kubenswrapper[4766]: E1209 03:13:57.319559 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-gx9l2_openshift-multus(c83a9d31-9c87-4a13-ab9a-2992e852eb47)\"" pod="openshift-multus/multus-gx9l2" podUID="c83a9d31-9c87-4a13-ab9a-2992e852eb47" Dec 09 03:13:57 crc kubenswrapper[4766]: I1209 03:13:57.839167 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:57 crc kubenswrapper[4766]: E1209 03:13:57.839344 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:13:58 crc kubenswrapper[4766]: I1209 03:13:58.325445 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/1.log" Dec 09 03:13:58 crc kubenswrapper[4766]: E1209 03:13:58.819290 4766 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 09 03:13:58 crc kubenswrapper[4766]: I1209 03:13:58.838815 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:13:58 crc kubenswrapper[4766]: E1209 03:13:58.840304 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:13:58 crc kubenswrapper[4766]: I1209 03:13:58.840434 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:13:58 crc kubenswrapper[4766]: I1209 03:13:58.840461 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:13:58 crc kubenswrapper[4766]: E1209 03:13:58.840762 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:13:58 crc kubenswrapper[4766]: E1209 03:13:58.840899 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:13:58 crc kubenswrapper[4766]: E1209 03:13:58.941907 4766 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 03:13:59 crc kubenswrapper[4766]: I1209 03:13:59.838140 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:13:59 crc kubenswrapper[4766]: E1209 03:13:59.838401 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:14:00 crc kubenswrapper[4766]: I1209 03:14:00.838337 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:00 crc kubenswrapper[4766]: I1209 03:14:00.838408 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:00 crc kubenswrapper[4766]: I1209 03:14:00.838337 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:00 crc kubenswrapper[4766]: E1209 03:14:00.838536 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:14:00 crc kubenswrapper[4766]: E1209 03:14:00.838721 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:14:00 crc kubenswrapper[4766]: E1209 03:14:00.838895 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:14:01 crc kubenswrapper[4766]: I1209 03:14:01.838578 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:01 crc kubenswrapper[4766]: E1209 03:14:01.838891 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:14:02 crc kubenswrapper[4766]: I1209 03:14:02.838371 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:02 crc kubenswrapper[4766]: I1209 03:14:02.838439 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:02 crc kubenswrapper[4766]: E1209 03:14:02.838592 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:14:02 crc kubenswrapper[4766]: I1209 03:14:02.838389 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:02 crc kubenswrapper[4766]: E1209 03:14:02.838727 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:14:02 crc kubenswrapper[4766]: E1209 03:14:02.838986 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:14:03 crc kubenswrapper[4766]: I1209 03:14:03.838386 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:03 crc kubenswrapper[4766]: E1209 03:14:03.838612 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:14:03 crc kubenswrapper[4766]: I1209 03:14:03.841083 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:14:03 crc kubenswrapper[4766]: E1209 03:14:03.943755 4766 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.350581 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/3.log" Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.354354 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerStarted","Data":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.354798 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.388069 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podStartSLOduration=101.388042652 podStartE2EDuration="1m41.388042652s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:04.386330095 +0000 UTC m=+126.095635531" watchObservedRunningTime="2025-12-09 03:14:04.388042652 +0000 UTC m=+126.097348088" Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.679502 4766 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z6qth"] Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.679654 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:04 crc kubenswrapper[4766]: E1209 03:14:04.679786 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.838516 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.838652 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:04 crc kubenswrapper[4766]: E1209 03:14:04.838690 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:14:04 crc kubenswrapper[4766]: I1209 03:14:04.838754 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:04 crc kubenswrapper[4766]: E1209 03:14:04.838917 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:14:04 crc kubenswrapper[4766]: E1209 03:14:04.839014 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:14:06 crc kubenswrapper[4766]: I1209 03:14:06.838640 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:06 crc kubenswrapper[4766]: I1209 03:14:06.838671 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:06 crc kubenswrapper[4766]: E1209 03:14:06.839458 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:14:06 crc kubenswrapper[4766]: I1209 03:14:06.838829 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:06 crc kubenswrapper[4766]: E1209 03:14:06.839560 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:14:06 crc kubenswrapper[4766]: I1209 03:14:06.838766 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:06 crc kubenswrapper[4766]: E1209 03:14:06.839639 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:14:06 crc kubenswrapper[4766]: E1209 03:14:06.839780 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:14:07 crc kubenswrapper[4766]: I1209 03:14:07.838959 4766 scope.go:117] "RemoveContainer" containerID="1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4" Dec 09 03:14:08 crc kubenswrapper[4766]: I1209 03:14:08.373575 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/1.log" Dec 09 03:14:08 crc kubenswrapper[4766]: I1209 03:14:08.373644 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gx9l2" event={"ID":"c83a9d31-9c87-4a13-ab9a-2992e852eb47","Type":"ContainerStarted","Data":"25bcb6c0cf26212f22beb70f607504f6c683f40eb583a1e29ff90970937bad7a"} Dec 09 03:14:08 crc kubenswrapper[4766]: I1209 03:14:08.839147 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:08 crc kubenswrapper[4766]: I1209 03:14:08.839535 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:08 crc kubenswrapper[4766]: I1209 03:14:08.839615 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:08 crc kubenswrapper[4766]: I1209 03:14:08.839628 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:08 crc kubenswrapper[4766]: E1209 03:14:08.841499 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:14:08 crc kubenswrapper[4766]: E1209 03:14:08.841621 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:14:08 crc kubenswrapper[4766]: E1209 03:14:08.841895 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:14:08 crc kubenswrapper[4766]: E1209 03:14:08.842034 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:14:08 crc kubenswrapper[4766]: E1209 03:14:08.944434 4766 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 03:14:10 crc kubenswrapper[4766]: I1209 03:14:10.839047 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:10 crc kubenswrapper[4766]: I1209 03:14:10.839157 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:10 crc kubenswrapper[4766]: I1209 03:14:10.839303 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:10 crc kubenswrapper[4766]: E1209 03:14:10.839311 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:14:10 crc kubenswrapper[4766]: E1209 03:14:10.839454 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:14:10 crc kubenswrapper[4766]: E1209 03:14:10.839551 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:14:10 crc kubenswrapper[4766]: I1209 03:14:10.839609 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:10 crc kubenswrapper[4766]: E1209 03:14:10.839689 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:14:12 crc kubenswrapper[4766]: I1209 03:14:12.838929 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:12 crc kubenswrapper[4766]: I1209 03:14:12.839003 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:12 crc kubenswrapper[4766]: I1209 03:14:12.838929 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:12 crc kubenswrapper[4766]: I1209 03:14:12.839017 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:12 crc kubenswrapper[4766]: E1209 03:14:12.839167 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 03:14:12 crc kubenswrapper[4766]: E1209 03:14:12.839468 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 03:14:12 crc kubenswrapper[4766]: E1209 03:14:12.839880 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 03:14:12 crc kubenswrapper[4766]: E1209 03:14:12.840119 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6qth" podUID="5c9eb693-99eb-4b02-b33a-26d506eeb3f1" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.838815 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.838950 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.839048 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.839074 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.842850 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.843187 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.844555 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.844945 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.844956 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 03:14:14 crc kubenswrapper[4766]: I1209 03:14:14.845255 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.749301 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.808411 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n6tj2"] Dec 09 03:14:23 crc 
kubenswrapper[4766]: I1209 03:14:23.809919 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fbnmb"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.809821 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.811841 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m8mqz"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.812850 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.812996 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.813493 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.814112 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.815492 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxqs4"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.816263 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.819435 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.819737 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.820240 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.821374 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.821584 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.821783 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.822386 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.823872 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.824574 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.824848 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 09 03:14:23 crc 
kubenswrapper[4766]: I1209 03:14:23.825037 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.825776 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.826105 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.827099 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.829425 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.830581 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.831409 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.831438 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.832082 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.832152 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.832095 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.832867 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.833638 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.833792 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.833842 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.833791 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.834335 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.834384 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.834474 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.834515 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 
03:14:23.843671 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.845009 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.845496 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.846193 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.846510 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.846693 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.846896 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.852552 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.853067 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.852591 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.854083 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.855018 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894129 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ss24g"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894312 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894597 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894672 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894764 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894799 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894847 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894948 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894590 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894950 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895163 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895188 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.894872 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895133 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895325 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 
03:14:23.895382 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895456 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895498 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895529 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895579 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895499 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895643 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895751 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895805 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.895951 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896079 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896124 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-config\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896155 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896195 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56991f65-9178-42a0-ba48-3a53256cd715-config\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896240 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c633b9b3-8368-4de4-94c9-a1220ec6f07e-node-pullsecrets\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896285 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-client-ca\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896325 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-etcd-serving-ca\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896363 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-encryption-config\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896387 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-audit\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896414 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-serving-cert\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896468 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896495 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/56991f65-9178-42a0-ba48-3a53256cd715-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896526 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896553 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-config\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896581 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-client-ca\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896615 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882f22c6-5509-4647-a337-121cca0e1622-serving-cert\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896641 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-machine-approver-tls\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896666 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-audit-policies\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896691 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l7zt\" (UniqueName: \"kubernetes.io/projected/2f2facf5-7977-44e9-beea-141276d212a5-kube-api-access-2l7zt\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896716 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896764 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e870077d-3b39-484c-a3c7-3b3fdd81e92e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896792 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s8z7\" (UniqueName: \"kubernetes.io/projected/c633b9b3-8368-4de4-94c9-a1220ec6f07e-kube-api-access-7s8z7\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896818 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-etcd-client\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896844 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48920893-23b6-4a46-9d8b-207acc99c16f-audit-dir\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896868 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-image-import-ca\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896893 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-config\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896921 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-serving-cert\") pod 
\"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896948 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896973 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c633b9b3-8368-4de4-94c9-a1220ec6f07e-audit-dir\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.896997 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-etcd-client\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897034 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f2facf5-7977-44e9-beea-141276d212a5-audit-dir\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897058 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897082 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whr7q\" (UniqueName: \"kubernetes.io/projected/e870077d-3b39-484c-a3c7-3b3fdd81e92e-kube-api-access-whr7q\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897107 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897132 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7bs\" (UniqueName: \"kubernetes.io/projected/56991f65-9178-42a0-ba48-3a53256cd715-kube-api-access-2d7bs\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897156 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-auth-proxy-config\") pod \"machine-approver-56656f9798-2fbhb\" (UID: 
\"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897178 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-kube-api-access-t2jnx\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897204 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897248 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-encryption-config\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897281 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897309 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897337 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-audit-policies\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897362 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4fnc\" (UniqueName: \"kubernetes.io/projected/21953e96-95ef-438b-a25a-e70f7ad6f7be-kube-api-access-w4fnc\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.897392 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21953e96-95ef-438b-a25a-e70f7ad6f7be-serving-cert\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.898963 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.904237 4766 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n6tj2"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.908047 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.910086 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56991f65-9178-42a0-ba48-3a53256cd715-images\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.910163 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.910301 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-config\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.910462 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4phpw\" (UniqueName: \"kubernetes.io/projected/48920893-23b6-4a46-9d8b-207acc99c16f-kube-api-access-4phpw\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" 
Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.910580 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e870077d-3b39-484c-a3c7-3b3fdd81e92e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.910629 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bqxn\" (UniqueName: \"kubernetes.io/projected/882f22c6-5509-4647-a337-121cca0e1622-kube-api-access-2bqxn\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.910700 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.910737 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.911017 4766 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.911197 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fbnmb"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.911257 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.911481 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.911617 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.911787 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.911941 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.912035 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.912123 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.912179 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.912350 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.912469 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.913235 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.913461 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.913884 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-fpj7h"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.914335 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-fpj7h" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.914607 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.917421 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.922231 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vfvvg"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.922874 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m8mqz"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.922997 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.924359 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.928982 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.929028 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.929173 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.929338 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.929471 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.929671 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.929854 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.929981 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.933727 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 
03:14:23.936327 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tfx22"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.936708 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.936960 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.938689 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.939253 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.954430 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.954643 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.955111 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.956065 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.959317 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.975931 4766 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.975992 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.977311 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.977501 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.977707 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.978509 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.978625 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.978683 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.978901 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.979380 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.980048 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.980325 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.980561 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.980664 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.985385 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.985737 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.985912 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986077 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c75f2"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986236 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986300 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986614 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986689 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986734 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986623 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986684 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.986676 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.988092 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.988740 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.989433 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8f855"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.990280 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.993934 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.994194 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.994451 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.995887 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.996158 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.996857 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.998058 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp"] Dec 09 03:14:23 crc kubenswrapper[4766]: I1209 03:14:23.998494 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.000682 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.001364 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.002731 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9wx6d"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.003721 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.009400 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.010692 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011344 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011527 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011574 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c633b9b3-8368-4de4-94c9-a1220ec6f07e-audit-dir\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011610 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-etcd-client\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011641 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f2facf5-7977-44e9-beea-141276d212a5-audit-dir\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011665 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011691 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whr7q\" (UniqueName: \"kubernetes.io/projected/e870077d-3b39-484c-a3c7-3b3fdd81e92e-kube-api-access-whr7q\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011720 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011699 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f2facf5-7977-44e9-beea-141276d212a5-audit-dir\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011747 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2227c3d3-0ba1-4f48-969e-6572a2ef618e-metrics-tls\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011856 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-serving-cert\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.011938 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2483b49-abc4-447a-9896-90574a532a13-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zxkrl\" (UID: \"d2483b49-abc4-447a-9896-90574a532a13\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012141 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7bs\" (UniqueName: \"kubernetes.io/projected/56991f65-9178-42a0-ba48-3a53256cd715-kube-api-access-2d7bs\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012348 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-auth-proxy-config\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012389 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jnx\" (UniqueName: 
\"kubernetes.io/projected/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-kube-api-access-t2jnx\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012429 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012460 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-encryption-config\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012523 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012580 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc 
kubenswrapper[4766]: I1209 03:14:24.012620 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-oauth-config\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012648 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z765j\" (UniqueName: \"kubernetes.io/projected/9577de09-1a6c-4b95-9d9f-225ad2e6be36-kube-api-access-z765j\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012679 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-audit-policies\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012706 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4fnc\" (UniqueName: \"kubernetes.io/projected/21953e96-95ef-438b-a25a-e70f7ad6f7be-kube-api-access-w4fnc\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012749 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21953e96-95ef-438b-a25a-e70f7ad6f7be-serving-cert\") pod \"route-controller-manager-6576b87f9c-nvxzs\" 
(UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012779 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56991f65-9178-42a0-ba48-3a53256cd715-images\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012838 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zndlh\" (UniqueName: \"kubernetes.io/projected/b36ce791-255f-4a9e-9f2a-f1dc62b13892-kube-api-access-zndlh\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012887 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012927 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-config\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012957 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.012994 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9577de09-1a6c-4b95-9d9f-225ad2e6be36-trusted-ca\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013043 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4phpw\" (UniqueName: \"kubernetes.io/projected/48920893-23b6-4a46-9d8b-207acc99c16f-kube-api-access-4phpw\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013078 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e870077d-3b39-484c-a3c7-3b3fdd81e92e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013107 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bqxn\" (UniqueName: \"kubernetes.io/projected/882f22c6-5509-4647-a337-121cca0e1622-kube-api-access-2bqxn\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:24 crc 
kubenswrapper[4766]: I1209 03:14:24.013138 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013202 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-config\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013281 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013311 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf478\" (UniqueName: \"kubernetes.io/projected/34120810-df87-4443-a2f7-16982e46027d-kube-api-access-lf478\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013336 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b36ce791-255f-4a9e-9f2a-f1dc62b13892-serving-cert\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013385 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56991f65-9178-42a0-ba48-3a53256cd715-config\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013432 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c633b9b3-8368-4de4-94c9-a1220ec6f07e-node-pullsecrets\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013475 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-client-ca\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013515 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-etcd-serving-ca\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013561 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-config\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013605 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-encryption-config\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013630 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-audit\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc 
kubenswrapper[4766]: I1209 03:14:24.013659 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbjfv\" (UniqueName: \"kubernetes.io/projected/73cae1ca-ac0e-4f90-92ec-d4077800d063-kube-api-access-fbjfv\") pod \"downloads-7954f5f757-fpj7h\" (UID: \"73cae1ca-ac0e-4f90-92ec-d4077800d063\") " pod="openshift-console/downloads-7954f5f757-fpj7h" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013692 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013719 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blpcf\" (UniqueName: \"kubernetes.io/projected/2227c3d3-0ba1-4f48-969e-6572a2ef618e-kube-api-access-blpcf\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013743 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9577de09-1a6c-4b95-9d9f-225ad2e6be36-config\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013773 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-console-config\") pod 
\"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013794 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-oauth-serving-cert\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013817 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-serving-cert\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013843 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-serving-cert\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013892 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013941 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/56991f65-9178-42a0-ba48-3a53256cd715-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.013980 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2227c3d3-0ba1-4f48-969e-6572a2ef618e-trusted-ca\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014014 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014050 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-config\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014084 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2227c3d3-0ba1-4f48-969e-6572a2ef618e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 
03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014118 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9577de09-1a6c-4b95-9d9f-225ad2e6be36-serving-cert\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014155 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-client-ca\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014192 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-trusted-ca-bundle\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014228 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014267 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882f22c6-5509-4647-a337-121cca0e1622-serving-cert\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014302 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-machine-approver-tls\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014333 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014362 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-audit-policies\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014385 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l7zt\" (UniqueName: \"kubernetes.io/projected/2f2facf5-7977-44e9-beea-141276d212a5-kube-api-access-2l7zt\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014418 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014464 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e870077d-3b39-484c-a3c7-3b3fdd81e92e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014488 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s8z7\" (UniqueName: \"kubernetes.io/projected/c633b9b3-8368-4de4-94c9-a1220ec6f07e-kube-api-access-7s8z7\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014513 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmsvc\" (UniqueName: \"kubernetes.io/projected/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-kube-api-access-cmsvc\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014541 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-etcd-client\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 
03:14:24.014563 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-service-ca-bundle\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014592 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48920893-23b6-4a46-9d8b-207acc99c16f-audit-dir\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014614 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-image-import-ca\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-service-ca\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014662 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wtj\" (UniqueName: \"kubernetes.io/projected/d2483b49-abc4-447a-9896-90574a532a13-kube-api-access-f9wtj\") pod \"cluster-samples-operator-665b6dd947-zxkrl\" (UID: \"d2483b49-abc4-447a-9896-90574a532a13\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014696 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-config\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014723 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-serving-cert\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.014895 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.015802 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/56991f65-9178-42a0-ba48-3a53256cd715-images\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.016247 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.016624 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-audit-policies\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: 
\"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.016780 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.017126 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.018991 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.019047 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-etcd-client\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.019110 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-serving-cert\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.019278 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c633b9b3-8368-4de4-94c9-a1220ec6f07e-audit-dir\") pod 
\"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.019415 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.020027 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.020055 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56991f65-9178-42a0-ba48-3a53256cd715-config\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.020148 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c633b9b3-8368-4de4-94c9-a1220ec6f07e-node-pullsecrets\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.020448 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 03:14:24 crc 
kubenswrapper[4766]: I1209 03:14:24.020639 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4p5j6"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.020898 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48920893-23b6-4a46-9d8b-207acc99c16f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.020931 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-client-ca\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.020922 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.021338 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.021709 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e870077d-3b39-484c-a3c7-3b3fdd81e92e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:24 crc 
kubenswrapper[4766]: I1209 03:14:24.021866 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.022259 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.034995 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-auth-proxy-config\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.035176 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-config\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.035472 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-serving-cert\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.035537 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.035689 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.035992 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-config\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.036151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-encryption-config\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.022194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-etcd-serving-ca\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.036597 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxqs4"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.037057 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.037301 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/48920893-23b6-4a46-9d8b-207acc99c16f-encryption-config\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.037311 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-audit\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.037488 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48920893-23b6-4a46-9d8b-207acc99c16f-audit-dir\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.037768 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.038052 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-audit-policies\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.039283 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.039816 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e870077d-3b39-484c-a3c7-3b3fdd81e92e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.040410 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-config\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.040614 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c633b9b3-8368-4de4-94c9-a1220ec6f07e-image-import-ca\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.040425 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-client-ca\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.041169 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c633b9b3-8368-4de4-94c9-a1220ec6f07e-etcd-client\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.041470 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.041894 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21953e96-95ef-438b-a25a-e70f7ad6f7be-serving-cert\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.042003 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-config\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 
03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.042175 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.042818 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/56991f65-9178-42a0-ba48-3a53256cd715-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.046332 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.047580 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.047706 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882f22c6-5509-4647-a337-121cca0e1622-serving-cert\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.049294 4766 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.049371 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-machine-approver-tls\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.050799 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.051539 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.053616 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.053922 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.055060 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.056651 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swbbr"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.057643 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.057796 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.058578 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.059353 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.060803 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.061053 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gljdx"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.061715 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.063551 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tfx22"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.065063 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6pzhg"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.066309 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.066676 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fpj7h"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.068079 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vfvvg"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.069441 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xrdh9"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.070510 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xrdh9" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.070567 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.071844 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ss24g"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.073870 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.075065 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4p5j6"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.076622 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.077951 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.079510 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.079948 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.081062 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.082253 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.083984 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.085464 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.087062 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.089665 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.091327 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.092416 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xrdh9"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.093287 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.093672 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swbbr"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.094793 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-82txl"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.096123 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns/dns-default-brcxt"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.096205 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.097469 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.097597 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9wx6d"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.098766 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.100666 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.102234 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.103337 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.105107 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6pzhg"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.106535 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.108068 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c75f2"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.109623 4766 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.111540 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gljdx"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.113609 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-brcxt"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.114050 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.115691 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ba986e-97eb-4b03-ac68-e959e127efe6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.115758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46928ed3-a9cb-4bcf-8647-237ef0418cb8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.115791 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46928ed3-a9cb-4bcf-8647-237ef0418cb8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.115824 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692e07fb-1bec-44f3-8112-4c5d496b7b4a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.115878 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-oauth-config\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.115900 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z765j\" (UniqueName: \"kubernetes.io/projected/9577de09-1a6c-4b95-9d9f-225ad2e6be36-kube-api-access-z765j\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.115943 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwmzn\" (UniqueName: \"kubernetes.io/projected/46928ed3-a9cb-4bcf-8647-237ef0418cb8-kube-api-access-vwmzn\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.115989 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0581951-bbb8-4550-8ced-7c1431f6deb8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vt8xz\" (UID: \"d0581951-bbb8-4550-8ced-7c1431f6deb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.116539 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zndlh\" (UniqueName: \"kubernetes.io/projected/b36ce791-255f-4a9e-9f2a-f1dc62b13892-kube-api-access-zndlh\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.116657 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/46928ed3-a9cb-4bcf-8647-237ef0418cb8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.116841 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14f5010c-daa1-4ded-ab26-58e171159446-srv-cert\") pod \"catalog-operator-68c6474976-456mp\" (UID: \"14f5010c-daa1-4ded-ab26-58e171159446\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.116905 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.116945 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9577de09-1a6c-4b95-9d9f-225ad2e6be36-trusted-ca\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.116998 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfx7n\" (UniqueName: \"kubernetes.io/projected/0af4a108-cb4c-4537-859d-5ca0874be360-kube-api-access-rfx7n\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117066 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f12f58f9-e382-49ce-843a-c9da746a5b80-signing-key\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgftf\" (UniqueName: \"kubernetes.io/projected/f12f58f9-e382-49ce-843a-c9da746a5b80-kube-api-access-kgftf\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:24 crc kubenswrapper[4766]: 
I1209 03:14:24.117132 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117279 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0af4a108-cb4c-4537-859d-5ca0874be360-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117359 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117405 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-tmpfs\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117487 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-webhook-cert\") 
pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt5gl\" (UniqueName: \"kubernetes.io/projected/07235424-e14e-42c1-9508-746e61b3a531-kube-api-access-mt5gl\") pod \"migrator-59844c95c7-jqb5p\" (UID: \"07235424-e14e-42c1-9508-746e61b3a531\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117561 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-82txl"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117643 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9d7\" (UniqueName: \"kubernetes.io/projected/14f5010c-daa1-4ded-ab26-58e171159446-kube-api-access-5j9d7\") pod \"catalog-operator-68c6474976-456mp\" (UID: \"14f5010c-daa1-4ded-ab26-58e171159446\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117690 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf478\" (UniqueName: \"kubernetes.io/projected/34120810-df87-4443-a2f7-16982e46027d-kube-api-access-lf478\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117732 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b36ce791-255f-4a9e-9f2a-f1dc62b13892-serving-cert\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117780 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-config\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117811 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ba986e-97eb-4b03-ac68-e959e127efe6-config\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117845 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rhtm\" (UniqueName: \"kubernetes.io/projected/d0581951-bbb8-4550-8ced-7c1431f6deb8-kube-api-access-4rhtm\") pod \"control-plane-machine-set-operator-78cbb6b69f-vt8xz\" (UID: \"d0581951-bbb8-4550-8ced-7c1431f6deb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117874 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxwp\" (UniqueName: \"kubernetes.io/projected/e2bcd744-9d24-446e-b445-b32b262e3093-kube-api-access-kqxwp\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117895 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14f5010c-daa1-4ded-ab26-58e171159446-profile-collector-cert\") pod \"catalog-operator-68c6474976-456mp\" (UID: \"14f5010c-daa1-4ded-ab26-58e171159446\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.117987 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbjfv\" (UniqueName: \"kubernetes.io/projected/73cae1ca-ac0e-4f90-92ec-d4077800d063-kube-api-access-fbjfv\") pod \"downloads-7954f5f757-fpj7h\" (UID: \"73cae1ca-ac0e-4f90-92ec-d4077800d063\") " pod="openshift-console/downloads-7954f5f757-fpj7h" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.118044 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0af4a108-cb4c-4537-859d-5ca0874be360-srv-cert\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.118076 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9577de09-1a6c-4b95-9d9f-225ad2e6be36-config\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.119343 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9577de09-1a6c-4b95-9d9f-225ad2e6be36-trusted-ca\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " 
pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.119822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9577de09-1a6c-4b95-9d9f-225ad2e6be36-config\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.119851 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-config\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.118113 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blpcf\" (UniqueName: \"kubernetes.io/projected/2227c3d3-0ba1-4f48-969e-6572a2ef618e-kube-api-access-blpcf\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120027 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-serving-cert\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120093 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-console-config\") pod 
\"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120159 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-oauth-serving-cert\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120240 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2227c3d3-0ba1-4f48-969e-6572a2ef618e-trusted-ca\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120281 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/692e07fb-1bec-44f3-8112-4c5d496b7b4a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120325 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2227c3d3-0ba1-4f48-969e-6572a2ef618e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120374 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9577de09-1a6c-4b95-9d9f-225ad2e6be36-serving-cert\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120685 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-trusted-ca-bundle\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120729 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0ba986e-97eb-4b03-ac68-e959e127efe6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120858 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120901 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2bcd744-9d24-446e-b445-b32b262e3093-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.120937 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f12f58f9-e382-49ce-843a-c9da746a5b80-signing-cabundle\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121134 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrk8\" (UniqueName: \"kubernetes.io/projected/692e07fb-1bec-44f3-8112-4c5d496b7b4a-kube-api-access-ngrk8\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121196 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glnsl\" (UniqueName: \"kubernetes.io/projected/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-kube-api-access-glnsl\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121316 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmsvc\" (UniqueName: \"kubernetes.io/projected/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-kube-api-access-cmsvc\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121619 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-service-ca-bundle\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121662 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121692 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-service-ca\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121730 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9wtj\" (UniqueName: \"kubernetes.io/projected/d2483b49-abc4-447a-9896-90574a532a13-kube-api-access-f9wtj\") pod \"cluster-samples-operator-665b6dd947-zxkrl\" (UID: \"d2483b49-abc4-447a-9896-90574a532a13\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121773 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kfdv\" (UniqueName: \"kubernetes.io/projected/f839f7fc-1e7f-4c71-9c02-f456ffacb094-kube-api-access-6kfdv\") pod \"marketplace-operator-79b997595-swbbr\" (UID: 
\"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121837 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bcd744-9d24-446e-b445-b32b262e3093-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121884 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-console-config\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121914 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2227c3d3-0ba1-4f48-969e-6572a2ef618e-metrics-tls\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121955 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-serving-cert\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.121990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d2483b49-abc4-447a-9896-90574a532a13-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zxkrl\" (UID: \"d2483b49-abc4-447a-9896-90574a532a13\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.122028 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.122616 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-oauth-serving-cert\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.125346 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-service-ca\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.125489 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-t4ptb"] Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.125696 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9577de09-1a6c-4b95-9d9f-225ad2e6be36-serving-cert\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " 
pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.126987 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b36ce791-255f-4a9e-9f2a-f1dc62b13892-serving-cert\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.127415 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2227c3d3-0ba1-4f48-969e-6572a2ef618e-trusted-ca\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.127674 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-serving-cert\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.127723 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2227c3d3-0ba1-4f48-969e-6572a2ef618e-metrics-tls\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.128921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-oauth-config\") pod \"console-f9d7485db-ss24g\" 
(UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.129783 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-serving-cert\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.130526 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.132080 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-trusted-ca-bundle\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.133304 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.133638 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2483b49-abc4-447a-9896-90574a532a13-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zxkrl\" (UID: \"d2483b49-abc4-447a-9896-90574a532a13\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" Dec 09 03:14:24 crc 
kubenswrapper[4766]: I1209 03:14:24.133981 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ce791-255f-4a9e-9f2a-f1dc62b13892-service-ca-bundle\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.135044 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.153852 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.173123 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.193879 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.213932 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223042 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/692e07fb-1bec-44f3-8112-4c5d496b7b4a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223125 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0ba986e-97eb-4b03-ac68-e959e127efe6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223184 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2bcd744-9d24-446e-b445-b32b262e3093-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223237 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glnsl\" (UniqueName: \"kubernetes.io/projected/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-kube-api-access-glnsl\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223264 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f12f58f9-e382-49ce-843a-c9da746a5b80-signing-cabundle\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrk8\" (UniqueName: \"kubernetes.io/projected/692e07fb-1bec-44f3-8112-4c5d496b7b4a-kube-api-access-ngrk8\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223365 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223404 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kfdv\" (UniqueName: \"kubernetes.io/projected/f839f7fc-1e7f-4c71-9c02-f456ffacb094-kube-api-access-6kfdv\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223432 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bcd744-9d24-446e-b445-b32b262e3093-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223469 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223504 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ba986e-97eb-4b03-ac68-e959e127efe6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223560 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46928ed3-a9cb-4bcf-8647-237ef0418cb8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223586 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46928ed3-a9cb-4bcf-8647-237ef0418cb8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223616 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692e07fb-1bec-44f3-8112-4c5d496b7b4a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223672 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwmzn\" (UniqueName: \"kubernetes.io/projected/46928ed3-a9cb-4bcf-8647-237ef0418cb8-kube-api-access-vwmzn\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: 
\"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223700 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0581951-bbb8-4550-8ced-7c1431f6deb8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vt8xz\" (UID: \"d0581951-bbb8-4550-8ced-7c1431f6deb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223741 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/46928ed3-a9cb-4bcf-8647-237ef0418cb8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223765 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14f5010c-daa1-4ded-ab26-58e171159446-srv-cert\") pod \"catalog-operator-68c6474976-456mp\" (UID: \"14f5010c-daa1-4ded-ab26-58e171159446\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223791 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfx7n\" (UniqueName: \"kubernetes.io/projected/0af4a108-cb4c-4537-859d-5ca0874be360-kube-api-access-rfx7n\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223833 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f12f58f9-e382-49ce-843a-c9da746a5b80-signing-key\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223858 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgftf\" (UniqueName: \"kubernetes.io/projected/f12f58f9-e382-49ce-843a-c9da746a5b80-kube-api-access-kgftf\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223888 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223914 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0af4a108-cb4c-4537-859d-5ca0874be360-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223942 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9d7\" (UniqueName: \"kubernetes.io/projected/14f5010c-daa1-4ded-ab26-58e171159446-kube-api-access-5j9d7\") pod \"catalog-operator-68c6474976-456mp\" (UID: \"14f5010c-daa1-4ded-ab26-58e171159446\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223967 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-tmpfs\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.223990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-webhook-cert\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.224013 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt5gl\" (UniqueName: \"kubernetes.io/projected/07235424-e14e-42c1-9508-746e61b3a531-kube-api-access-mt5gl\") pod \"migrator-59844c95c7-jqb5p\" (UID: \"07235424-e14e-42c1-9508-746e61b3a531\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.224084 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ba986e-97eb-4b03-ac68-e959e127efe6-config\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.224110 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rhtm\" (UniqueName: 
\"kubernetes.io/projected/d0581951-bbb8-4550-8ced-7c1431f6deb8-kube-api-access-4rhtm\") pod \"control-plane-machine-set-operator-78cbb6b69f-vt8xz\" (UID: \"d0581951-bbb8-4550-8ced-7c1431f6deb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.224134 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxwp\" (UniqueName: \"kubernetes.io/projected/e2bcd744-9d24-446e-b445-b32b262e3093-kube-api-access-kqxwp\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.224156 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14f5010c-daa1-4ded-ab26-58e171159446-profile-collector-cert\") pod \"catalog-operator-68c6474976-456mp\" (UID: \"14f5010c-daa1-4ded-ab26-58e171159446\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.224187 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0af4a108-cb4c-4537-859d-5ca0874be360-srv-cert\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.225546 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-tmpfs\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 
09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.225947 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692e07fb-1bec-44f3-8112-4c5d496b7b4a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.226306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bcd744-9d24-446e-b445-b32b262e3093-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.226793 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46928ed3-a9cb-4bcf-8647-237ef0418cb8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.227639 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2bcd744-9d24-446e-b445-b32b262e3093-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.229224 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/692e07fb-1bec-44f3-8112-4c5d496b7b4a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.229560 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/46928ed3-a9cb-4bcf-8647-237ef0418cb8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.233271 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.252677 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.273142 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.293748 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.313261 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.333760 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.355529 
4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.373582 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.394539 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.412966 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.433840 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.453681 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.473536 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.514012 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.532686 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.568884 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.581194 4766 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.594313 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.614747 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.634237 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.653126 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.673760 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.678530 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14f5010c-daa1-4ded-ab26-58e171159446-srv-cert\") pod \"catalog-operator-68c6474976-456mp\" (UID: \"14f5010c-daa1-4ded-ab26-58e171159446\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.694276 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.699531 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14f5010c-daa1-4ded-ab26-58e171159446-profile-collector-cert\") pod \"catalog-operator-68c6474976-456mp\" (UID: 
\"14f5010c-daa1-4ded-ab26-58e171159446\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.700627 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0af4a108-cb4c-4537-859d-5ca0874be360-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.713199 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.731635 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:24 crc kubenswrapper[4766]: E1209 03:14:24.731817 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:16:26.731793642 +0000 UTC m=+268.441099088 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.732176 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.732378 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.732949 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.739253 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-webhook-cert\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.741275 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.754272 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.774187 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.793612 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.800247 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0af4a108-cb4c-4537-859d-5ca0874be360-srv-cert\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.833485 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.833840 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.834073 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.837895 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.838630 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.840150 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.843524 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-whr7q\" (UniqueName: \"kubernetes.io/projected/e870077d-3b39-484c-a3c7-3b3fdd81e92e-kube-api-access-whr7q\") pod \"openshift-apiserver-operator-796bbdcf4f-55tcm\" (UID: \"e870077d-3b39-484c-a3c7-3b3fdd81e92e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.860404 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7bs\" (UniqueName: \"kubernetes.io/projected/56991f65-9178-42a0-ba48-3a53256cd715-kube-api-access-2d7bs\") pod \"machine-api-operator-5694c8668f-fbnmb\" (UID: \"56991f65-9178-42a0-ba48-3a53256cd715\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.874634 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.880654 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0581951-bbb8-4550-8ced-7c1431f6deb8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vt8xz\" (UID: \"d0581951-bbb8-4550-8ced-7c1431f6deb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.885253 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4fnc\" (UniqueName: \"kubernetes.io/projected/21953e96-95ef-438b-a25a-e70f7ad6f7be-kube-api-access-w4fnc\") pod \"route-controller-manager-6576b87f9c-nvxzs\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.893637 4766 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.901039 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.913543 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.921624 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.961114 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/cc673bbc-4d96-40c5-a0ff-eeb9adfaa274-kube-api-access-t2jnx\") pod \"machine-approver-56656f9798-2fbhb\" (UID: \"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.973542 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 03:14:24 crc kubenswrapper[4766]: I1209 03:14:24.975246 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bqxn\" (UniqueName: \"kubernetes.io/projected/882f22c6-5509-4647-a337-121cca0e1622-kube-api-access-2bqxn\") pod \"controller-manager-879f6c89f-n6tj2\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.013799 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 
03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.015294 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f12f58f9-e382-49ce-843a-c9da746a5b80-signing-cabundle\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.033646 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4phpw\" (UniqueName: \"kubernetes.io/projected/48920893-23b6-4a46-9d8b-207acc99c16f-kube-api-access-4phpw\") pod \"apiserver-7bbb656c7d-dxnx5\" (UID: \"48920893-23b6-4a46-9d8b-207acc99c16f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.034207 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.040339 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.051167 4766 request.go:700] Waited for 1.014537965s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dsigning-key&limit=500&resourceVersion=0 Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.053369 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.057498 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.063457 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.068748 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f12f58f9-e382-49ce-843a-c9da746a5b80-signing-key\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.073830 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.074025 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.093320 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.093935 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.130758 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.134687 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s8z7\" (UniqueName: \"kubernetes.io/projected/c633b9b3-8368-4de4-94c9-a1220ec6f07e-kube-api-access-7s8z7\") pod \"apiserver-76f77b778f-m8mqz\" (UID: \"c633b9b3-8368-4de4-94c9-a1220ec6f07e\") " pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.152780 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l7zt\" (UniqueName: \"kubernetes.io/projected/2f2facf5-7977-44e9-beea-141276d212a5-kube-api-access-2l7zt\") pod \"oauth-openshift-558db77b4-fxqs4\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.163738 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.175692 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.181085 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.193774 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.207010 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.214343 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: E1209 03:14:25.225769 4766 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Dec 09 03:14:25 crc kubenswrapper[4766]: E1209 03:14:25.227751 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics podName:f839f7fc-1e7f-4c71-9c02-f456ffacb094 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:25.727722797 +0000 UTC m=+147.437028223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics") pod "marketplace-operator-79b997595-swbbr" (UID: "f839f7fc-1e7f-4c71-9c02-f456ffacb094") : failed to sync secret cache: timed out waiting for the condition Dec 09 03:14:25 crc kubenswrapper[4766]: E1209 03:14:25.227930 4766 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Dec 09 03:14:25 crc kubenswrapper[4766]: E1209 03:14:25.228054 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca podName:f839f7fc-1e7f-4c71-9c02-f456ffacb094 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:25.728018795 +0000 UTC m=+147.437324211 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca") pod "marketplace-operator-79b997595-swbbr" (UID: "f839f7fc-1e7f-4c71-9c02-f456ffacb094") : failed to sync configmap cache: timed out waiting for the condition Dec 09 03:14:25 crc kubenswrapper[4766]: E1209 03:14:25.228070 4766 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 09 03:14:25 crc kubenswrapper[4766]: E1209 03:14:25.228178 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0ba986e-97eb-4b03-ac68-e959e127efe6-serving-cert podName:a0ba986e-97eb-4b03-ac68-e959e127efe6 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:25.728150888 +0000 UTC m=+147.437456314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a0ba986e-97eb-4b03-ac68-e959e127efe6-serving-cert") pod "kube-apiserver-operator-766d6c64bb-9lkj7" (UID: "a0ba986e-97eb-4b03-ac68-e959e127efe6") : failed to sync secret cache: timed out waiting for the condition Dec 09 03:14:25 crc kubenswrapper[4766]: E1209 03:14:25.228183 4766 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 09 03:14:25 crc kubenswrapper[4766]: E1209 03:14:25.228341 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0ba986e-97eb-4b03-ac68-e959e127efe6-config podName:a0ba986e-97eb-4b03-ac68-e959e127efe6 nodeName:}" failed. No retries permitted until 2025-12-09 03:14:25.728296072 +0000 UTC m=+147.437601498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a0ba986e-97eb-4b03-ac68-e959e127efe6-config") pod "kube-apiserver-operator-766d6c64bb-9lkj7" (UID: "a0ba986e-97eb-4b03-ac68-e959e127efe6") : failed to sync configmap cache: timed out waiting for the condition Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.237055 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.241757 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm"] Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.260643 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs"] Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.262177 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.280922 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.301369 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.314650 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.334624 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.369384 4766 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.373605 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 03:14:25 crc kubenswrapper[4766]: W1209 03:14:25.377054 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-5f862513cd8ba2fec7ec7331d101d28f2b02703ff237e59294578bd896cd971c WatchSource:0}: Error finding container 5f862513cd8ba2fec7ec7331d101d28f2b02703ff237e59294578bd896cd971c: Status 404 returned error can't find the container with id 5f862513cd8ba2fec7ec7331d101d28f2b02703ff237e59294578bd896cd971c Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.399008 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.413005 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.414720 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.415282 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n6tj2"] Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.436099 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.454060 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.463227 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" event={"ID":"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274","Type":"ContainerStarted","Data":"f9090fb664d6689c361abd3958fe950db659c1918fdd66e6fccdaa6ecd12f223"} Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.464825 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5f862513cd8ba2fec7ec7331d101d28f2b02703ff237e59294578bd896cd971c"} Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.466874 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" event={"ID":"21953e96-95ef-438b-a25a-e70f7ad6f7be","Type":"ContainerStarted","Data":"4a9500716b90a39d8e7b53529b8200dd83fdccf57dae0c305cda37b9dfb28aff"} Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.472680 4766 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" event={"ID":"e870077d-3b39-484c-a3c7-3b3fdd81e92e","Type":"ContainerStarted","Data":"fff31eaab1142f91f396939a9c043349ce1d9bcf31e7690ae5efa8ae476f6c29"} Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.479954 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 09 03:14:25 crc kubenswrapper[4766]: W1209 03:14:25.484506 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882f22c6_5509_4647_a337_121cca0e1622.slice/crio-7d93a1cad014c8e16fb6a5a9964397014f7d86040adb5fbe7e5b895e5c821bba WatchSource:0}: Error finding container 7d93a1cad014c8e16fb6a5a9964397014f7d86040adb5fbe7e5b895e5c821bba: Status 404 returned error can't find the container with id 7d93a1cad014c8e16fb6a5a9964397014f7d86040adb5fbe7e5b895e5c821bba Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.493092 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.514443 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.531366 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxqs4"] Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.532989 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 09 03:14:25 crc kubenswrapper[4766]: W1209 03:14:25.541621 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2facf5_7977_44e9_beea_141276d212a5.slice/crio-60199c4b1dd0ad177ebadd1f95967d79953ec307758b6bce22b989f98b55889d WatchSource:0}: Error finding container 60199c4b1dd0ad177ebadd1f95967d79953ec307758b6bce22b989f98b55889d: Status 404 returned error can't find the container with id 60199c4b1dd0ad177ebadd1f95967d79953ec307758b6bce22b989f98b55889d Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.553135 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.554067 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5"] Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.576361 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 03:14:25 crc kubenswrapper[4766]: W1209 03:14:25.594626 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48920893_23b6_4a46_9d8b_207acc99c16f.slice/crio-3ab371041849fc2bbd33f9564f784e39740aa4b7c87002b3cb474fe7547aed84 WatchSource:0}: Error finding container 3ab371041849fc2bbd33f9564f784e39740aa4b7c87002b3cb474fe7547aed84: Status 404 returned error can't find the container with id 3ab371041849fc2bbd33f9564f784e39740aa4b7c87002b3cb474fe7547aed84 Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.594755 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.618236 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.634110 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.656975 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.677227 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fbnmb"] Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.677517 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.693980 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.699819 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m8mqz"] Dec 09 03:14:25 crc kubenswrapper[4766]: W1209 03:14:25.707940 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56991f65_9178_42a0_ba48_3a53256cd715.slice/crio-1e338b8fbb3723076dbcd23fcf1f6769e2d6afbab51dd1018ab9e0f93b0df2fd WatchSource:0}: Error finding container 1e338b8fbb3723076dbcd23fcf1f6769e2d6afbab51dd1018ab9e0f93b0df2fd: Status 404 returned error can't find the container with id 1e338b8fbb3723076dbcd23fcf1f6769e2d6afbab51dd1018ab9e0f93b0df2fd Dec 09 03:14:25 crc kubenswrapper[4766]: W1209 03:14:25.711030 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc633b9b3_8368_4de4_94c9_a1220ec6f07e.slice/crio-fb6d329a8909fdd1698eed942af5270d31bbde08d1453e47461cdee93cb98193 WatchSource:0}: Error finding container fb6d329a8909fdd1698eed942af5270d31bbde08d1453e47461cdee93cb98193: Status 404 returned error can't find the container with id 
fb6d329a8909fdd1698eed942af5270d31bbde08d1453e47461cdee93cb98193 Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.713753 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.733535 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.753988 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.767642 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.767734 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ba986e-97eb-4b03-ac68-e959e127efe6-config\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.767908 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.767946 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ba986e-97eb-4b03-ac68-e959e127efe6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.769298 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ba986e-97eb-4b03-ac68-e959e127efe6-config\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.770272 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.774556 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.777478 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ba986e-97eb-4b03-ac68-e959e127efe6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.777585 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.796584 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.815028 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.833713 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.854598 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.894199 4766 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.913604 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.934531 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.953575 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.973179 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 03:14:25 crc kubenswrapper[4766]: I1209 03:14:25.998077 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.031773 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z765j\" (UniqueName: \"kubernetes.io/projected/9577de09-1a6c-4b95-9d9f-225ad2e6be36-kube-api-access-z765j\") pod \"console-operator-58897d9998-vfvvg\" (UID: \"9577de09-1a6c-4b95-9d9f-225ad2e6be36\") " pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.051255 4766 request.go:700] Waited for 1.933365789s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.053454 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zndlh\" (UniqueName: \"kubernetes.io/projected/b36ce791-255f-4a9e-9f2a-f1dc62b13892-kube-api-access-zndlh\") pod \"authentication-operator-69f744f599-tfx22\" (UID: \"b36ce791-255f-4a9e-9f2a-f1dc62b13892\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.068105 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf478\" (UniqueName: \"kubernetes.io/projected/34120810-df87-4443-a2f7-16982e46027d-kube-api-access-lf478\") pod \"console-f9d7485db-ss24g\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.087847 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2227c3d3-0ba1-4f48-969e-6572a2ef618e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 
09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.110190 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wtj\" (UniqueName: \"kubernetes.io/projected/d2483b49-abc4-447a-9896-90574a532a13-kube-api-access-f9wtj\") pod \"cluster-samples-operator-665b6dd947-zxkrl\" (UID: \"d2483b49-abc4-447a-9896-90574a532a13\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.127285 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbjfv\" (UniqueName: \"kubernetes.io/projected/73cae1ca-ac0e-4f90-92ec-d4077800d063-kube-api-access-fbjfv\") pod \"downloads-7954f5f757-fpj7h\" (UID: \"73cae1ca-ac0e-4f90-92ec-d4077800d063\") " pod="openshift-console/downloads-7954f5f757-fpj7h" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.150738 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blpcf\" (UniqueName: \"kubernetes.io/projected/2227c3d3-0ba1-4f48-969e-6572a2ef618e-kube-api-access-blpcf\") pod \"ingress-operator-5b745b69d9-8xwh7\" (UID: \"2227c3d3-0ba1-4f48-969e-6572a2ef618e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.170350 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmsvc\" (UniqueName: \"kubernetes.io/projected/6c839bfc-fc6c-4a74-a879-c9d24b335eb5-kube-api-access-cmsvc\") pod \"openshift-config-operator-7777fb866f-kmq9l\" (UID: \"6c839bfc-fc6c-4a74-a879-c9d24b335eb5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.173395 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.176234 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.185377 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.186812 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.197757 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.197867 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-fpj7h" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.203529 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.214027 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.215429 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.220604 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.257964 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0ba986e-97eb-4b03-ac68-e959e127efe6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lkj7\" (UID: \"a0ba986e-97eb-4b03-ac68-e959e127efe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.278982 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glnsl\" (UniqueName: \"kubernetes.io/projected/b1d44c09-a0e8-43de-8405-b0cf0490a2b6-kube-api-access-glnsl\") pod \"packageserver-d55dfcdfc-cfqbb\" (UID: \"b1d44c09-a0e8-43de-8405-b0cf0490a2b6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.312751 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.316726 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46928ed3-a9cb-4bcf-8647-237ef0418cb8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.328851 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrk8\" (UniqueName: \"kubernetes.io/projected/692e07fb-1bec-44f3-8112-4c5d496b7b4a-kube-api-access-ngrk8\") pod \"kube-storage-version-migrator-operator-b67b599dd-phhsp\" (UID: \"692e07fb-1bec-44f3-8112-4c5d496b7b4a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.337032 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kfdv\" (UniqueName: \"kubernetes.io/projected/f839f7fc-1e7f-4c71-9c02-f456ffacb094-kube-api-access-6kfdv\") pod \"marketplace-operator-79b997595-swbbr\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.358686 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt5gl\" (UniqueName: \"kubernetes.io/projected/07235424-e14e-42c1-9508-746e61b3a531-kube-api-access-mt5gl\") pod \"migrator-59844c95c7-jqb5p\" (UID: \"07235424-e14e-42c1-9508-746e61b3a531\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.370102 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5j9d7\" (UniqueName: \"kubernetes.io/projected/14f5010c-daa1-4ded-ab26-58e171159446-kube-api-access-5j9d7\") pod \"catalog-operator-68c6474976-456mp\" (UID: \"14f5010c-daa1-4ded-ab26-58e171159446\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.378959 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.402913 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwmzn\" (UniqueName: \"kubernetes.io/projected/46928ed3-a9cb-4bcf-8647-237ef0418cb8-kube-api-access-vwmzn\") pod \"cluster-image-registry-operator-dc59b4c8b-fcbp6\" (UID: \"46928ed3-a9cb-4bcf-8647-237ef0418cb8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.403647 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.404386 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.437047 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rhtm\" (UniqueName: \"kubernetes.io/projected/d0581951-bbb8-4550-8ced-7c1431f6deb8-kube-api-access-4rhtm\") pod \"control-plane-machine-set-operator-78cbb6b69f-vt8xz\" (UID: \"d0581951-bbb8-4550-8ced-7c1431f6deb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.438579 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgftf\" (UniqueName: \"kubernetes.io/projected/f12f58f9-e382-49ce-843a-c9da746a5b80-kube-api-access-kgftf\") pod \"service-ca-9c57cc56f-4p5j6\" (UID: \"f12f58f9-e382-49ce-843a-c9da746a5b80\") " pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.449796 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfx7n\" (UniqueName: \"kubernetes.io/projected/0af4a108-cb4c-4537-859d-5ca0874be360-kube-api-access-rfx7n\") pod \"olm-operator-6b444d44fb-vjp5g\" (UID: \"0af4a108-cb4c-4537-859d-5ca0874be360\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.480719 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxwp\" (UniqueName: \"kubernetes.io/projected/e2bcd744-9d24-446e-b445-b32b262e3093-kube-api-access-kqxwp\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhkj2\" (UID: \"e2bcd744-9d24-446e-b445-b32b262e3093\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.528160 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl"] Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.537486 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.547848 4766 generic.go:334] "Generic (PLEG): container finished" podID="48920893-23b6-4a46-9d8b-207acc99c16f" containerID="f5218619d5ea2db2934639bfdf0a32cbb60ebaaee19fb47d184376837d055150" exitCode=0 Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.547985 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" event={"ID":"48920893-23b6-4a46-9d8b-207acc99c16f","Type":"ContainerDied","Data":"f5218619d5ea2db2934639bfdf0a32cbb60ebaaee19fb47d184376837d055150"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.548016 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" event={"ID":"48920893-23b6-4a46-9d8b-207acc99c16f","Type":"ContainerStarted","Data":"3ab371041849fc2bbd33f9564f784e39740aa4b7c87002b3cb474fe7547aed84"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.549509 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.552252 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580675 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36b72bd7-a396-448c-a04e-7f10fdf20d03-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9wx6d\" (UID: \"36b72bd7-a396-448c-a04e-7f10fdf20d03\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580739 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-client\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580789 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-secret-volume\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580818 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-config-volume\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580847 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ltllc\" (UniqueName: \"kubernetes.io/projected/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-kube-api-access-ltllc\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580877 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxdr\" (UniqueName: \"kubernetes.io/projected/257bfa27-baf5-4470-84f1-71d06b37f763-kube-api-access-chxdr\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580901 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-serving-cert\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580921 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-service-ca\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580958 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.580988 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcmg4\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-kube-api-access-dcmg4\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581013 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f55j\" (UniqueName: \"kubernetes.io/projected/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-kube-api-access-8f55j\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581038 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00c0f7fd-e9af-4903-9eda-cb613560325a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581063 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00c0f7fd-e9af-4903-9eda-cb613560325a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581093 
4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bks9j\" (UniqueName: \"kubernetes.io/projected/b7d85e7b-038a-4c18-96d6-4aa4c811ff44-kube-api-access-bks9j\") pod \"package-server-manager-789f6589d5-2djrp\" (UID: \"b7d85e7b-038a-4c18-96d6-4aa4c811ff44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581120 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/257bfa27-baf5-4470-84f1-71d06b37f763-service-ca-bundle\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581161 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-proxy-tls\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581191 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxqmx\" (UniqueName: \"kubernetes.io/projected/ce678949-3cb1-499d-a5a4-a80d9337bb3f-kube-api-access-gxqmx\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581236 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-ca\") 
pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581266 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-bound-sa-token\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581297 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkwp\" (UniqueName: \"kubernetes.io/projected/6743e7ca-f9d2-4267-ad68-8a35030cb43c-kube-api-access-6hkwp\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581322 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a702e81-477c-44a0-909b-520ddc169ca8-metrics-tls\") pod \"dns-operator-744455d44c-c75f2\" (UID: \"6a702e81-477c-44a0-909b-520ddc169ca8\") " pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581358 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e81ed6-e457-4e59-a25e-b40dceea3cfd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581385 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9a28313-282c-4ab6-a49f-e68fee577ba7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581408 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9a28313-282c-4ab6-a49f-e68fee577ba7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581434 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-images\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581457 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8bk2\" (UniqueName: \"kubernetes.io/projected/6a702e81-477c-44a0-909b-520ddc169ca8-kube-api-access-z8bk2\") pod \"dns-operator-744455d44c-c75f2\" (UID: \"6a702e81-477c-44a0-909b-520ddc169ca8\") " pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581491 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581521 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpf24\" (UniqueName: \"kubernetes.io/projected/36b72bd7-a396-448c-a04e-7f10fdf20d03-kube-api-access-gpf24\") pod \"multus-admission-controller-857f4d67dd-9wx6d\" (UID: \"36b72bd7-a396-448c-a04e-7f10fdf20d03\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581548 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c0f7fd-e9af-4903-9eda-cb613560325a-config\") pod \"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581569 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6743e7ca-f9d2-4267-ad68-8a35030cb43c-proxy-tls\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581611 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a28313-282c-4ab6-a49f-e68fee577ba7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581632 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7aaedcf1-1390-4a48-be10-0f34ad7f0082-cert\") pod \"ingress-canary-xrdh9\" (UID: \"7aaedcf1-1390-4a48-be10-0f34ad7f0082\") " pod="openshift-ingress-canary/ingress-canary-xrdh9" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581683 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-tls\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581706 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e81ed6-e457-4e59-a25e-b40dceea3cfd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581731 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-stats-auth\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581773 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-trusted-ca\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581801 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-certificates\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581822 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-default-certificate\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581845 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7d85e7b-038a-4c18-96d6-4aa4c811ff44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2djrp\" (UID: \"b7d85e7b-038a-4c18-96d6-4aa4c811ff44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581866 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce678949-3cb1-499d-a5a4-a80d9337bb3f-serving-cert\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581891 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce678949-3cb1-499d-a5a4-a80d9337bb3f-config\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581926 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-metrics-certs\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581949 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfmrx\" (UniqueName: \"kubernetes.io/projected/7aaedcf1-1390-4a48-be10-0f34ad7f0082-kube-api-access-rfmrx\") pod \"ingress-canary-xrdh9\" (UID: \"7aaedcf1-1390-4a48-be10-0f34ad7f0082\") " pod="openshift-ingress-canary/ingress-canary-xrdh9" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581971 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfhqx\" (UniqueName: \"kubernetes.io/projected/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-kube-api-access-sfhqx\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.581993 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/6743e7ca-f9d2-4267-ad68-8a35030cb43c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.582014 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-config\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: E1209 03:14:26.588827 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.088802435 +0000 UTC m=+148.798108041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.602694 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.610047 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"52f2f9288fdc1c95bff2e9468f03ca55c28dfdde1c033d5ffc3b8bb88c80264c"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.610102 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"181213ac15e9c225670309656adac980b41c34236d536b617e3dfa094d302b52"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.610686 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.620852 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" event={"ID":"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274","Type":"ContainerStarted","Data":"449baa26dfe002c028eaa535626f801910a60e2e43904696b031fc3c28343716"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.620904 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" event={"ID":"cc673bbc-4d96-40c5-a0ff-eeb9adfaa274","Type":"ContainerStarted","Data":"5a58a97e70cf35f2f7ae0d823c6049954f19794e2c955568c6bde4573c70a6bb"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.627057 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.637780 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.661382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" event={"ID":"56991f65-9178-42a0-ba48-3a53256cd715","Type":"ContainerStarted","Data":"13106924a02d5c658f7f357e3d2cbe2ba237fec89d069758d880d321db22b200"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.661453 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" event={"ID":"56991f65-9178-42a0-ba48-3a53256cd715","Type":"ContainerStarted","Data":"3d6b0197a52ecbc9afffd896f2e6d5a3abd274fd32374b010f4b67c7a838cadb"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.661468 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" event={"ID":"56991f65-9178-42a0-ba48-3a53256cd715","Type":"ContainerStarted","Data":"1e338b8fbb3723076dbcd23fcf1f6769e2d6afbab51dd1018ab9e0f93b0df2fd"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.668101 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.673479 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" event={"ID":"882f22c6-5509-4647-a337-121cca0e1622","Type":"ContainerStarted","Data":"128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.673541 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" event={"ID":"882f22c6-5509-4647-a337-121cca0e1622","Type":"ContainerStarted","Data":"7d93a1cad014c8e16fb6a5a9964397014f7d86040adb5fbe7e5b895e5c821bba"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.673929 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.679518 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4e4da3442a4b157eb62c252889a6de86ccc36a1439857e841b315839714fc3ba"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682367 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f517c100fba22bfcee5f21d121fb833176c8b899a0efb39f17f4fc45d1754deb"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682417 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c05b5c1ba2877516b33d56b478e76538bf6883ff8fdbf7833bbbbac211009703"} Dec 
09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682676 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:26 crc kubenswrapper[4766]: E1209 03:14:26.682829 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.182797951 +0000 UTC m=+148.892103377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682891 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-default-certificate\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682917 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7d85e7b-038a-4c18-96d6-4aa4c811ff44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2djrp\" (UID: 
\"b7d85e7b-038a-4c18-96d6-4aa4c811ff44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jr9c\" (UniqueName: \"kubernetes.io/projected/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-kube-api-access-2jr9c\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682962 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce678949-3cb1-499d-a5a4-a80d9337bb3f-serving-cert\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682978 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce678949-3cb1-499d-a5a4-a80d9337bb3f-config\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.682995 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-certs\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683078 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-metrics-certs\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683097 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nlmt\" (UniqueName: \"kubernetes.io/projected/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-kube-api-access-6nlmt\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683119 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-registration-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683185 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfmrx\" (UniqueName: \"kubernetes.io/projected/7aaedcf1-1390-4a48-be10-0f34ad7f0082-kube-api-access-rfmrx\") pod \"ingress-canary-xrdh9\" (UID: \"7aaedcf1-1390-4a48-be10-0f34ad7f0082\") " pod="openshift-ingress-canary/ingress-canary-xrdh9" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683232 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwwg\" (UniqueName: \"kubernetes.io/projected/3fe6598b-ddd1-4382-b40a-2389a26958db-kube-api-access-wpwwg\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683271 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sfhqx\" (UniqueName: \"kubernetes.io/projected/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-kube-api-access-sfhqx\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683336 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6743e7ca-f9d2-4267-ad68-8a35030cb43c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683392 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-plugins-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.683433 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-config\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.686880 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce678949-3cb1-499d-a5a4-a80d9337bb3f-config\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.687175 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3fe6598b-ddd1-4382-b40a-2389a26958db-metrics-tls\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.687605 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6743e7ca-f9d2-4267-ad68-8a35030cb43c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.688057 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36b72bd7-a396-448c-a04e-7f10fdf20d03-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9wx6d\" (UID: \"36b72bd7-a396-448c-a04e-7f10fdf20d03\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.688106 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-client\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.688707 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-mountpoint-dir\") pod 
\"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.688813 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-secret-volume\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.689302 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-config\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.689496 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-config-volume\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.689587 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltllc\" (UniqueName: \"kubernetes.io/projected/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-kube-api-access-ltllc\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.689709 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxdr\" (UniqueName: 
\"kubernetes.io/projected/257bfa27-baf5-4470-84f1-71d06b37f763-kube-api-access-chxdr\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.691103 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.692134 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-serving-cert\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.692828 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-service-ca\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.693309 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcmg4\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-kube-api-access-dcmg4\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.693449 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8f55j\" (UniqueName: \"kubernetes.io/projected/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-kube-api-access-8f55j\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695310 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00c0f7fd-e9af-4903-9eda-cb613560325a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695353 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00c0f7fd-e9af-4903-9eda-cb613560325a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695394 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bks9j\" (UniqueName: \"kubernetes.io/projected/b7d85e7b-038a-4c18-96d6-4aa4c811ff44-kube-api-access-bks9j\") pod \"package-server-manager-789f6589d5-2djrp\" (UID: \"b7d85e7b-038a-4c18-96d6-4aa4c811ff44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695416 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/257bfa27-baf5-4470-84f1-71d06b37f763-service-ca-bundle\") pod 
\"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695438 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-proxy-tls\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695518 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxqmx\" (UniqueName: \"kubernetes.io/projected/ce678949-3cb1-499d-a5a4-a80d9337bb3f-kube-api-access-gxqmx\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695540 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-ca\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-bound-sa-token\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695581 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-socket-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695552 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695623 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkwp\" (UniqueName: \"kubernetes.io/projected/6743e7ca-f9d2-4267-ad68-8a35030cb43c-kube-api-access-6hkwp\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695788 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a702e81-477c-44a0-909b-520ddc169ca8-metrics-tls\") pod \"dns-operator-744455d44c-c75f2\" (UID: \"6a702e81-477c-44a0-909b-520ddc169ca8\") " pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.693668 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-config-volume\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.695965 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e81ed6-e457-4e59-a25e-b40dceea3cfd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.696083 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9a28313-282c-4ab6-a49f-e68fee577ba7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.696107 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9a28313-282c-4ab6-a49f-e68fee577ba7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.694480 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-service-ca\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.699870 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/257bfa27-baf5-4470-84f1-71d06b37f763-service-ca-bundle\") pod \"router-default-5444994796-8f855\" (UID: 
\"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.700931 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e81ed6-e457-4e59-a25e-b40dceea3cfd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704348 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-images\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704390 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8bk2\" (UniqueName: \"kubernetes.io/projected/6a702e81-477c-44a0-909b-520ddc169ca8-kube-api-access-z8bk2\") pod \"dns-operator-744455d44c-c75f2\" (UID: \"6a702e81-477c-44a0-909b-520ddc169ca8\") " pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704443 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704602 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-node-bootstrap-token\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704659 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpf24\" (UniqueName: \"kubernetes.io/projected/36b72bd7-a396-448c-a04e-7f10fdf20d03-kube-api-access-gpf24\") pod \"multus-admission-controller-857f4d67dd-9wx6d\" (UID: \"36b72bd7-a396-448c-a04e-7f10fdf20d03\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704699 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c0f7fd-e9af-4903-9eda-cb613560325a-config\") pod \"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704834 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6743e7ca-f9d2-4267-ad68-8a35030cb43c-proxy-tls\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704884 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a28313-282c-4ab6-a49f-e68fee577ba7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.704904 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7aaedcf1-1390-4a48-be10-0f34ad7f0082-cert\") pod \"ingress-canary-xrdh9\" (UID: \"7aaedcf1-1390-4a48-be10-0f34ad7f0082\") " pod="openshift-ingress-canary/ingress-canary-xrdh9" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.705040 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3fe6598b-ddd1-4382-b40a-2389a26958db-config-volume\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.705082 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-tls\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.705107 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e81ed6-e457-4e59-a25e-b40dceea3cfd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.705127 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-csi-data-dir\") pod \"csi-hostpathplugin-82txl\" (UID: 
\"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.705166 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-stats-auth\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.706015 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-trusted-ca\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.706072 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-certificates\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.707865 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-certificates\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.708573 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c0f7fd-e9af-4903-9eda-cb613560325a-config\") pod 
\"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.709122 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" event={"ID":"21953e96-95ef-438b-a25a-e70f7ad6f7be","Type":"ContainerStarted","Data":"2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.709838 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-ca\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.710244 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9a28313-282c-4ab6-a49f-e68fee577ba7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.710723 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-images\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: E1209 03:14:26.711072 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.211049759 +0000 UTC m=+148.920355185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.712254 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-trusted-ca\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.713264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" event={"ID":"e870077d-3b39-484c-a3c7-3b3fdd81e92e","Type":"ContainerStarted","Data":"aed64de13b8b43a41929f17c5c0d0fc08c47d949fb7230839c5b68ec827d07a2"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.713328 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.715475 4766 generic.go:334] "Generic (PLEG): container finished" podID="c633b9b3-8368-4de4-94c9-a1220ec6f07e" containerID="58b9420f2a038404df58d8b9c953b4c6624987e4f695d112d2c65bce2a96089b" exitCode=0 Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.715568 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" event={"ID":"c633b9b3-8368-4de4-94c9-a1220ec6f07e","Type":"ContainerDied","Data":"58b9420f2a038404df58d8b9c953b4c6624987e4f695d112d2c65bce2a96089b"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.715611 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" event={"ID":"c633b9b3-8368-4de4-94c9-a1220ec6f07e","Type":"ContainerStarted","Data":"fb6d329a8909fdd1698eed942af5270d31bbde08d1453e47461cdee93cb98193"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.716426 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9a28313-282c-4ab6-a49f-e68fee577ba7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.721331 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6743e7ca-f9d2-4267-ad68-8a35030cb43c-proxy-tls\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.724856 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce678949-3cb1-499d-a5a4-a80d9337bb3f-serving-cert\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.724887 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e81ed6-e457-4e59-a25e-b40dceea3cfd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.724934 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-secret-volume\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.724884 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-serving-cert\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.725434 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36b72bd7-a396-448c-a04e-7f10fdf20d03-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9wx6d\" (UID: \"36b72bd7-a396-448c-a04e-7f10fdf20d03\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.725604 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00c0f7fd-e9af-4903-9eda-cb613560325a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 
03:14:26.725827 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-tls\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.726874 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" event={"ID":"2f2facf5-7977-44e9-beea-141276d212a5","Type":"ContainerStarted","Data":"464d3f552bf4e3b2c2b7119d46c4b66a54935d662874ad750e3ca9fe8956410a"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.726920 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" event={"ID":"2f2facf5-7977-44e9-beea-141276d212a5","Type":"ContainerStarted","Data":"60199c4b1dd0ad177ebadd1f95967d79953ec307758b6bce22b989f98b55889d"} Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.727620 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.728362 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-etcd-client\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.731007 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-default-certificate\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " 
pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.731458 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7d85e7b-038a-4c18-96d6-4aa4c811ff44-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2djrp\" (UID: \"b7d85e7b-038a-4c18-96d6-4aa4c811ff44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.733972 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-stats-auth\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.734632 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfmrx\" (UniqueName: \"kubernetes.io/projected/7aaedcf1-1390-4a48-be10-0f34ad7f0082-kube-api-access-rfmrx\") pod \"ingress-canary-xrdh9\" (UID: \"7aaedcf1-1390-4a48-be10-0f34ad7f0082\") " pod="openshift-ingress-canary/ingress-canary-xrdh9" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.735065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/257bfa27-baf5-4470-84f1-71d06b37f763-metrics-certs\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.735134 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7aaedcf1-1390-4a48-be10-0f34ad7f0082-cert\") pod \"ingress-canary-xrdh9\" (UID: \"7aaedcf1-1390-4a48-be10-0f34ad7f0082\") " 
pod="openshift-ingress-canary/ingress-canary-xrdh9" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.743552 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a702e81-477c-44a0-909b-520ddc169ca8-metrics-tls\") pod \"dns-operator-744455d44c-c75f2\" (UID: \"6a702e81-477c-44a0-909b-520ddc169ca8\") " pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.743622 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-proxy-tls\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.752179 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.754641 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.758132 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfhqx\" (UniqueName: \"kubernetes.io/projected/d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3-kube-api-access-sfhqx\") pod \"machine-config-operator-74547568cd-jpckj\" (UID: \"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.781056 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ss24g"] Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.789784 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ltllc\" (UniqueName: \"kubernetes.io/projected/dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f-kube-api-access-ltllc\") pod \"etcd-operator-b45778765-6pzhg\" (UID: \"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.808957 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809156 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-plugins-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809429 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3fe6598b-ddd1-4382-b40a-2389a26958db-metrics-tls\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809452 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-mountpoint-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809606 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-socket-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809669 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-node-bootstrap-token\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809710 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3fe6598b-ddd1-4382-b40a-2389a26958db-config-volume\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809739 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-csi-data-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809770 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jr9c\" (UniqueName: \"kubernetes.io/projected/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-kube-api-access-2jr9c\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809786 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-certs\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809815 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nlmt\" (UniqueName: \"kubernetes.io/projected/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-kube-api-access-6nlmt\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809830 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-registration-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.809847 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwwg\" (UniqueName: \"kubernetes.io/projected/3fe6598b-ddd1-4382-b40a-2389a26958db-kube-api-access-wpwwg\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.811727 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-socket-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: E1209 03:14:26.811974 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.311940422 +0000 UTC m=+149.021245848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.812045 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-plugins-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.816010 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-csi-data-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.817229 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-mountpoint-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.818044 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3fe6598b-ddd1-4382-b40a-2389a26958db-config-volume\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.819776 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-registration-dir\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.824240 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3fe6598b-ddd1-4382-b40a-2389a26958db-metrics-tls\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.824897 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkwp\" (UniqueName: \"kubernetes.io/projected/6743e7ca-f9d2-4267-ad68-8a35030cb43c-kube-api-access-6hkwp\") pod \"machine-config-controller-84d6567774-77sxr\" (UID: \"6743e7ca-f9d2-4267-ad68-8a35030cb43c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.829151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxdr\" (UniqueName: \"kubernetes.io/projected/257bfa27-baf5-4470-84f1-71d06b37f763-kube-api-access-chxdr\") pod \"router-default-5444994796-8f855\" (UID: \"257bfa27-baf5-4470-84f1-71d06b37f763\") " pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.832344 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-certs\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.834100 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-node-bootstrap-token\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.853862 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcmg4\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-kube-api-access-dcmg4\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.869494 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f55j\" (UniqueName: \"kubernetes.io/projected/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-kube-api-access-8f55j\") pod \"collect-profiles-29420820-4thjl\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.874241 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.888266 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.897695 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.927032 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l"] Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.904945 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bks9j\" (UniqueName: \"kubernetes.io/projected/b7d85e7b-038a-4c18-96d6-4aa4c811ff44-kube-api-access-bks9j\") pod \"package-server-manager-789f6589d5-2djrp\" (UID: \"b7d85e7b-038a-4c18-96d6-4aa4c811ff44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.912536 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:26 crc kubenswrapper[4766]: E1209 03:14:26.912899 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.412881266 +0000 UTC m=+149.122186692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.900260 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00c0f7fd-e9af-4903-9eda-cb613560325a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-94j6f\" (UID: \"00c0f7fd-e9af-4903-9eda-cb613560325a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.933483 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fpj7h"] Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.951103 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.956280 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.957644 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxqmx\" (UniqueName: \"kubernetes.io/projected/ce678949-3cb1-499d-a5a4-a80d9337bb3f-kube-api-access-gxqmx\") pod \"service-ca-operator-777779d784-sj9r4\" (UID: \"ce678949-3cb1-499d-a5a4-a80d9337bb3f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.974439 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9a28313-282c-4ab6-a49f-e68fee577ba7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-np46v\" (UID: \"c9a28313-282c-4ab6-a49f-e68fee577ba7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:26 crc kubenswrapper[4766]: I1209 03:14:26.989619 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.001696 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpf24\" (UniqueName: \"kubernetes.io/projected/36b72bd7-a396-448c-a04e-7f10fdf20d03-kube-api-access-gpf24\") pod \"multus-admission-controller-857f4d67dd-9wx6d\" (UID: \"36b72bd7-a396-448c-a04e-7f10fdf20d03\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.022476 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.023154 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.037359 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.037643 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.537603058 +0000 UTC m=+149.246908484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.038151 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xrdh9" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.040722 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.041174 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.541165195 +0000 UTC m=+149.250470621 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.095312 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8bk2\" (UniqueName: \"kubernetes.io/projected/6a702e81-477c-44a0-909b-520ddc169ca8-kube-api-access-z8bk2\") pod \"dns-operator-744455d44c-c75f2\" (UID: \"6a702e81-477c-44a0-909b-520ddc169ca8\") " pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.099968 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwwg\" (UniqueName: 
\"kubernetes.io/projected/3fe6598b-ddd1-4382-b40a-2389a26958db-kube-api-access-wpwwg\") pod \"dns-default-brcxt\" (UID: \"3fe6598b-ddd1-4382-b40a-2389a26958db\") " pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.100281 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nlmt\" (UniqueName: \"kubernetes.io/projected/1d2864cb-8f6d-4ee1-a329-ec4d70ed469d-kube-api-access-6nlmt\") pod \"machine-config-server-t4ptb\" (UID: \"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d\") " pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.112918 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-bound-sa-token\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.121564 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jr9c\" (UniqueName: \"kubernetes.io/projected/ed5281e5-9357-4e0d-bb0f-ab3a1b177bed-kube-api-access-2jr9c\") pod \"csi-hostpathplugin-82txl\" (UID: \"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed\") " pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.124506 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tfx22"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.142426 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.142701 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.642677265 +0000 UTC m=+149.351982681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.148926 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vfvvg"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.159626 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.166613 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.173568 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.180692 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.220028 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.224090 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.233672 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.248225 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.248757 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.748740309 +0000 UTC m=+149.458045735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.353461 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.353749 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.853733304 +0000 UTC m=+149.563038730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.356734 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-82txl" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.365573 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.376945 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-t4ptb" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.389904 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swbbr"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.456613 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.456946 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:27.956934219 +0000 UTC m=+149.666239645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.460893 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.512802 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-55tcm" podStartSLOduration=124.512779678 podStartE2EDuration="2m4.512779678s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:27.500752191 +0000 UTC m=+149.210057617" watchObservedRunningTime="2025-12-09 03:14:27.512779678 +0000 UTC m=+149.222085104" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.559168 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.559310 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 03:14:28.059288773 +0000 UTC m=+149.768594199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.559595 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.559998 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.059991012 +0000 UTC m=+149.769296438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.663023 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.663406 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.163383753 +0000 UTC m=+149.872689179 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.766993 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.767778 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.267750691 +0000 UTC m=+149.977056117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.806478 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.806778 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.855169 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.860594 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp"] Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.862846 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" event={"ID":"c633b9b3-8368-4de4-94c9-a1220ec6f07e","Type":"ContainerStarted","Data":"8f9a0d4d882fd8db4119fe9eceac5115649a3c388fa16e97d67f2e4384fb2166"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.875026 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" event={"ID":"07235424-e14e-42c1-9508-746e61b3a531","Type":"ContainerStarted","Data":"2ea6dcc733bd0668b30d27a63a96d84c82fd8757126aa7fbd8a0e80909025f2c"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 
03:14:27.876889 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.877423 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.377399213 +0000 UTC m=+150.086704639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.883125 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" event={"ID":"48920893-23b6-4a46-9d8b-207acc99c16f","Type":"ContainerStarted","Data":"95936a048b98fb46e90ceac8ef0a5207f4b8db5c17445a367515291fc3c14372"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.901276 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vfvvg" event={"ID":"9577de09-1a6c-4b95-9d9f-225ad2e6be36","Type":"ContainerStarted","Data":"3be954a817a4005c7af0e69a7b43cd6ceb189c28eeb4e3fe23409a110205f4e4"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.936996 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" event={"ID":"6c839bfc-fc6c-4a74-a879-c9d24b335eb5","Type":"ContainerStarted","Data":"fbfb6edca5a3c6187ad68451f57d51514c8b8ede687d008da0d2663915dd2baa"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.937554 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" event={"ID":"6c839bfc-fc6c-4a74-a879-c9d24b335eb5","Type":"ContainerStarted","Data":"44fbe0e51869e4b55f91b8dfec24ebd1c1caaf90586ac1bcfac0c4338cb4353d"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.960408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8f855" event={"ID":"257bfa27-baf5-4470-84f1-71d06b37f763","Type":"ContainerStarted","Data":"13e3224e97be46d25b32bdeb4d158994f07cab7536d48387a3f56d4541c34dbd"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.962512 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" podStartSLOduration=124.962487806 podStartE2EDuration="2m4.962487806s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:27.920829743 +0000 UTC m=+149.630135169" watchObservedRunningTime="2025-12-09 03:14:27.962487806 +0000 UTC m=+149.671793252" Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.965112 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" event={"ID":"b36ce791-255f-4a9e-9f2a-f1dc62b13892","Type":"ContainerStarted","Data":"b3081ef97952a4321acd0eb85fbedb4276187d18e7fc5acd2b2f815b34752d45"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.978317 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:27 crc kubenswrapper[4766]: E1209 03:14:27.980481 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.480452164 +0000 UTC m=+150.189757580 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.982298 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ss24g" event={"ID":"34120810-df87-4443-a2f7-16982e46027d","Type":"ContainerStarted","Data":"6dc7a56cf53232cb0fd90304c0e7e995dbc2f617f57ef780c5809ec194e2776a"} Dec 09 03:14:27 crc kubenswrapper[4766]: I1209 03:14:27.993305 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" event={"ID":"2227c3d3-0ba1-4f48-969e-6572a2ef618e","Type":"ContainerStarted","Data":"2daf03fbf8e06d1731c9206a8516b14c29e795970a0f78157192b6afd2b7cea6"} Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.004093 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" 
event={"ID":"a0ba986e-97eb-4b03-ac68-e959e127efe6","Type":"ContainerStarted","Data":"b55ffd841c891925615b73d82b981c008cda941f233dcec8f8a2850019d62e7b"} Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.051095 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" podStartSLOduration=125.051071285 podStartE2EDuration="2m5.051071285s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:28.004850678 +0000 UTC m=+149.714156114" watchObservedRunningTime="2025-12-09 03:14:28.051071285 +0000 UTC m=+149.760376711" Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.069811 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp"] Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.073576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" event={"ID":"f839f7fc-1e7f-4c71-9c02-f456ffacb094","Type":"ContainerStarted","Data":"e81a896603ec654897241fb82280ae101b599b6642dab0dcfbb32453140140ec"} Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.079271 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4p5j6"] Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.080345 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.081804 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.58178116 +0000 UTC m=+150.291086586 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.083129 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2"] Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.085113 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr"] Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.104054 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" event={"ID":"b1d44c09-a0e8-43de-8405-b0cf0490a2b6","Type":"ContainerStarted","Data":"37cdbbf0f58ef068c1656540c035b828d3be37dff3716aa0701d307d197ee453"} Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.106704 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2fbhb" podStartSLOduration=125.106690027 podStartE2EDuration="2m5.106690027s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:28.104500088 +0000 UTC 
m=+149.813805514" watchObservedRunningTime="2025-12-09 03:14:28.106690027 +0000 UTC m=+149.815995453" Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.144620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fpj7h" event={"ID":"73cae1ca-ac0e-4f90-92ec-d4077800d063","Type":"ContainerStarted","Data":"1fb25c53f0e3a0ac0a232f8bf831e40437e2bc3eff0cd6be07e4bb53b28e9c5d"} Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.171934 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" event={"ID":"d2483b49-abc4-447a-9896-90574a532a13","Type":"ContainerStarted","Data":"3001863e51f5f8a0b4364bf15156216a0ddf9828b44bdf99dc9bbfe812ed45cb"} Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.189381 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.192807 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.692786878 +0000 UTC m=+150.402092304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.291464 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.291992 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.791950095 +0000 UTC m=+150.501255521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.293384 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.294244 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.794194045 +0000 UTC m=+150.503499481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.389305 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xrdh9"] Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.395025 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.395982 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.895957363 +0000 UTC m=+150.605262789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.461867 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6pzhg"] Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.496374 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.496806 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:28.996791564 +0000 UTC m=+150.706096990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.516164 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp"] Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.568288 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" podStartSLOduration=125.568262257 podStartE2EDuration="2m5.568262257s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:28.535755384 +0000 UTC m=+150.245060810" watchObservedRunningTime="2025-12-09 03:14:28.568262257 +0000 UTC m=+150.277567683" Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.599995 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.600893 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 03:14:29.100867384 +0000 UTC m=+150.810172810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.607484 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fbnmb" podStartSLOduration=125.607444743 podStartE2EDuration="2m5.607444743s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:28.569636705 +0000 UTC m=+150.278942131" watchObservedRunningTime="2025-12-09 03:14:28.607444743 +0000 UTC m=+150.316750169" Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.702501 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.703344 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.203321779 +0000 UTC m=+150.912627205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: W1209 03:14:28.733338 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aaedcf1_1390_4a48_be10_0f34ad7f0082.slice/crio-6cd3475aab103fe91d35e2b1e6a2e44239d739d29307acb80bef5edd8a820f16 WatchSource:0}: Error finding container 6cd3475aab103fe91d35e2b1e6a2e44239d739d29307acb80bef5edd8a820f16: Status 404 returned error can't find the container with id 6cd3475aab103fe91d35e2b1e6a2e44239d739d29307acb80bef5edd8a820f16 Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.807056 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.808031 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.307981896 +0000 UTC m=+151.017287322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.808462 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.809237 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.309197488 +0000 UTC m=+151.018502914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.912609 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:28 crc kubenswrapper[4766]: E1209 03:14:28.913077 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.413055982 +0000 UTC m=+151.122361418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.921109 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4"] Dec 09 03:14:28 crc kubenswrapper[4766]: I1209 03:14:28.943154 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl"] Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:28.993818 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj"] Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.015309 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.015829 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.515813247 +0000 UTC m=+151.225118673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:29 crc kubenswrapper[4766]: W1209 03:14:29.057536 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb57e3485_ffb5_46c1_b8d6_2e0296b85fab.slice/crio-b8b692e59d2abaa4c2a28c89274e3ebf53d4ce08a8bf96a533dc40fb2547302f WatchSource:0}: Error finding container b8b692e59d2abaa4c2a28c89274e3ebf53d4ce08a8bf96a533dc40fb2547302f: Status 404 returned error can't find the container with id b8b692e59d2abaa4c2a28c89274e3ebf53d4ce08a8bf96a533dc40fb2547302f Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.084350 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" podStartSLOduration=126.084318339 podStartE2EDuration="2m6.084318339s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:29.079743135 +0000 UTC m=+150.789048581" watchObservedRunningTime="2025-12-09 03:14:29.084318339 +0000 UTC m=+150.793623765" Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.118102 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:29 crc 
kubenswrapper[4766]: E1209 03:14:29.119246 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.619204458 +0000 UTC m=+151.328509894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:29 crc kubenswrapper[4766]: W1209 03:14:29.125895 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0fdcd8a_ec05_47e3_8509_ec4c57a3c5e3.slice/crio-6878b0add9cc34402b366e34f5d53ace312d94c1c9b74738e87a6a783c158d98 WatchSource:0}: Error finding container 6878b0add9cc34402b366e34f5d53ace312d94c1c9b74738e87a6a783c158d98: Status 404 returned error can't find the container with id 6878b0add9cc34402b366e34f5d53ace312d94c1c9b74738e87a6a783c158d98 Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.191842 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c75f2"] Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.220126 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 
09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.220481 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.720469501 +0000 UTC m=+151.429774927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.272340 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" event={"ID":"14f5010c-daa1-4ded-ab26-58e171159446","Type":"ContainerStarted","Data":"fee08c9ea4b7a207769f901f1aa71053fe2a5d1b3aea6946090fd8dd1bc42437"} Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.310675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fpj7h" event={"ID":"73cae1ca-ac0e-4f90-92ec-d4077800d063","Type":"ContainerStarted","Data":"f69f548d1736757a2f57863f9fdfd7caf0497a48cb6d946d86c4e44d781698b8"} Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.313629 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-fpj7h" Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.321494 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.323407 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-fpj7h container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.323485 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fpj7h" podUID="73cae1ca-ac0e-4f90-92ec-d4077800d063" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.323670 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.823646357 +0000 UTC m=+151.532951783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.325601 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f"] Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.351070 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-82txl"] Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.373593 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xrdh9" event={"ID":"7aaedcf1-1390-4a48-be10-0f34ad7f0082","Type":"ContainerStarted","Data":"6cd3475aab103fe91d35e2b1e6a2e44239d739d29307acb80bef5edd8a820f16"} Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.376571 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ss24g" podStartSLOduration=126.376551695 podStartE2EDuration="2m6.376551695s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:29.349081998 +0000 UTC m=+151.058387424" watchObservedRunningTime="2025-12-09 03:14:29.376551695 +0000 UTC m=+151.085857121" Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.378614 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v"] Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 
03:14:29.414868 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" event={"ID":"d0581951-bbb8-4550-8ced-7c1431f6deb8","Type":"ContainerStarted","Data":"e8d8a02c74f12146b3c99990b95dc39374358e6e28eb655bac8f275f05b1ffb9"}
Dec 09 03:14:29 crc kubenswrapper[4766]: W1209 03:14:29.415329 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a28313_282c_4ab6_a49f_e68fee577ba7.slice/crio-e7cfe8fc938060537ba0127a15cfa668427a160a6a9178d9fdc623e1d8d5a20f WatchSource:0}: Error finding container e7cfe8fc938060537ba0127a15cfa668427a160a6a9178d9fdc623e1d8d5a20f: Status 404 returned error can't find the container with id e7cfe8fc938060537ba0127a15cfa668427a160a6a9178d9fdc623e1d8d5a20f
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.423547 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx"
Dec 09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.423950 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:29.923934994 +0000 UTC m=+151.633240420 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.434615 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ss24g" event={"ID":"34120810-df87-4443-a2f7-16982e46027d","Type":"ContainerStarted","Data":"3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.467428 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" event={"ID":"e2bcd744-9d24-446e-b445-b32b262e3093","Type":"ContainerStarted","Data":"fba9df9c3a774e99a18f10d3d5f88a13aa477f54aec0788e5ac99903dfe963ba"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.482067 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-brcxt"]
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.486403 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" event={"ID":"b36ce791-255f-4a9e-9f2a-f1dc62b13892","Type":"ContainerStarted","Data":"62ffe5bd7476bfa74e530c6e87de78d3aab64a07ef67d0ac287dc3bf1445e186"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.497164 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9wx6d"]
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.504691 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" event={"ID":"0af4a108-cb4c-4537-859d-5ca0874be360","Type":"ContainerStarted","Data":"ae675c8859f95e6785c88c06ad1d845b77c42bed46596375a8ba00066c530155"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.505968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" event={"ID":"f12f58f9-e382-49ce-843a-c9da746a5b80","Type":"ContainerStarted","Data":"3855fc106019c047722eebe9de8c2c2725b86daff16e03ae202ba17e2b165b17"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.507265 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" event={"ID":"07235424-e14e-42c1-9508-746e61b3a531","Type":"ContainerStarted","Data":"04b5fee937db6f5fb3a83a34b2e6199cd8ee9d880c8bf3584ccf7e1372fac955"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.508418 4766 generic.go:334] "Generic (PLEG): container finished" podID="6c839bfc-fc6c-4a74-a879-c9d24b335eb5" containerID="fbfb6edca5a3c6187ad68451f57d51514c8b8ede687d008da0d2663915dd2baa" exitCode=0
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.508459 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" event={"ID":"6c839bfc-fc6c-4a74-a879-c9d24b335eb5","Type":"ContainerDied","Data":"fbfb6edca5a3c6187ad68451f57d51514c8b8ede687d008da0d2663915dd2baa"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.514589 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" event={"ID":"b57e3485-ffb5-46c1-b8d6-2e0296b85fab","Type":"ContainerStarted","Data":"b8b692e59d2abaa4c2a28c89274e3ebf53d4ce08a8bf96a533dc40fb2547302f"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.531992 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.532844 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.032824664 +0000 UTC m=+151.742130090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.536783 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" event={"ID":"d2483b49-abc4-447a-9896-90574a532a13","Type":"ContainerStarted","Data":"e64ee5bf162d29d161cadb40a681b42648b1e07973d0dc11c137511ca5e83f2e"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.545828 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" event={"ID":"692e07fb-1bec-44f3-8112-4c5d496b7b4a","Type":"ContainerStarted","Data":"c295add4c7fe678a561f6c59aef86667db04192281befccf6bfb1960f55232a4"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.555721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" event={"ID":"ce678949-3cb1-499d-a5a4-a80d9337bb3f","Type":"ContainerStarted","Data":"79dd22483b13d5a59bb2014944565a954b4fd3883239eaa67b59d12010879667"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.560608 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" event={"ID":"b1d44c09-a0e8-43de-8405-b0cf0490a2b6","Type":"ContainerStarted","Data":"378f90fca615e087cb7cfb6844a08c62819e2671fc59f8945ea4fb7b40e94a69"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.561861 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.568197 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" event={"ID":"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3","Type":"ContainerStarted","Data":"6878b0add9cc34402b366e34f5d53ace312d94c1c9b74738e87a6a783c158d98"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.571794 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-tfx22" podStartSLOduration=126.571768513 podStartE2EDuration="2m6.571768513s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:29.568580426 +0000 UTC m=+151.277885842" watchObservedRunningTime="2025-12-09 03:14:29.571768513 +0000 UTC m=+151.281073939"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.575921 4766 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cfqbb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.575967 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" podUID="b1d44c09-a0e8-43de-8405-b0cf0490a2b6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.576226 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" event={"ID":"6743e7ca-f9d2-4267-ad68-8a35030cb43c","Type":"ContainerStarted","Data":"04c6bb41d7508eb8afe5f5b03919cfb277ba4cd3374a5e123852a3ff841f94ef"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.583203 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" event={"ID":"46928ed3-a9cb-4bcf-8647-237ef0418cb8","Type":"ContainerStarted","Data":"a800a1a955f5db568c100eb762b80bdc88eb452bdf2a6eda5d11256e233d042e"}
Dec 09 03:14:29 crc kubenswrapper[4766]: W1209 03:14:29.599914 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b72bd7_a396_448c_a04e_7f10fdf20d03.slice/crio-7f65df292c7f3beec31eaf68ba388a3e215abf76894a6eb2993c43ab3c80305d WatchSource:0}: Error finding container 7f65df292c7f3beec31eaf68ba388a3e215abf76894a6eb2993c43ab3c80305d: Status 404 returned error can't find the container with id 7f65df292c7f3beec31eaf68ba388a3e215abf76894a6eb2993c43ab3c80305d
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.600830 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8f855" event={"ID":"257bfa27-baf5-4470-84f1-71d06b37f763","Type":"ContainerStarted","Data":"4fde8f0b08bdd466d28cb9f76700247b99a8e3ccf71b581447a01aaea9688e7a"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.606799 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-fpj7h" podStartSLOduration=126.606784335 podStartE2EDuration="2m6.606784335s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:29.603128146 +0000 UTC m=+151.312433572" watchObservedRunningTime="2025-12-09 03:14:29.606784335 +0000 UTC m=+151.316089761"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.611439 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" event={"ID":"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f","Type":"ContainerStarted","Data":"98e6218d3c02411e0df4e333209d8b3fc0c90bcdc46e02f650db90d8dadfe839"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.628151 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-t4ptb" event={"ID":"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d","Type":"ContainerStarted","Data":"dbf43252393dab754858386941c94637c6047a760565d96a841256312caa6d67"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.631742 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" event={"ID":"b7d85e7b-038a-4c18-96d6-4aa4c811ff44","Type":"ContainerStarted","Data":"7066984e3e92ec18be80e50774ff11e7a4db333e964ec7991ec4a3cfb10c822d"}
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.637355 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.640630 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8f855" podStartSLOduration=126.640600055 podStartE2EDuration="2m6.640600055s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:29.635971779 +0000 UTC m=+151.345277205" watchObservedRunningTime="2025-12-09 03:14:29.640600055 +0000 UTC m=+151.349905501"
Dec 09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.661820 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.161796232 +0000 UTC m=+151.871101658 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.727652 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" podStartSLOduration=126.727621831 podStartE2EDuration="2m6.727621831s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:29.665537233 +0000 UTC m=+151.374842649" watchObservedRunningTime="2025-12-09 03:14:29.727621831 +0000 UTC m=+151.436927257"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.737883 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" podStartSLOduration=126.737856129 podStartE2EDuration="2m6.737856129s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:29.709021705 +0000 UTC m=+151.418327131" watchObservedRunningTime="2025-12-09 03:14:29.737856129 +0000 UTC m=+151.447161555"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.741604 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.742377 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.242352531 +0000 UTC m=+151.951657957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.843467 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx"
Dec 09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.843963 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.343942794 +0000 UTC m=+152.053248220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.879381 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8f855"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.886076 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 09 03:14:29 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld
Dec 09 03:14:29 crc kubenswrapper[4766]: [+]process-running ok
Dec 09 03:14:29 crc kubenswrapper[4766]: healthz check failed
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.886133 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 09 03:14:29 crc kubenswrapper[4766]: I1209 03:14:29.953743 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 03:14:29 crc kubenswrapper[4766]: E1209 03:14:29.955685 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.455661441 +0000 UTC m=+152.164966867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.056510 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx"
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.056961 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.556947576 +0000 UTC m=+152.266253002 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.157915 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.158455 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.658414575 +0000 UTC m=+152.367720001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.158701 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx"
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.159128 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.659111993 +0000 UTC m=+152.368417419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.207697 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.208884 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.259775 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.260584 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.760561221 +0000 UTC m=+152.469866647 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.273730 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.295991 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-62t52"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.364559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx"
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.367294 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.867271843 +0000 UTC m=+152.576577269 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.466198 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.467111 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:30.967091617 +0000 UTC m=+152.676397043 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.569383 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:31.069363908 +0000 UTC m=+152.778669334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.568931 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.670773 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.671812 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:31.171791083 +0000 UTC m=+152.881096499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.705814 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-t4ptb" event={"ID":"1d2864cb-8f6d-4ee1-a329-ec4d70ed469d","Type":"ContainerStarted","Data":"57b3945a8c52a9cdaf7c5d735bdf21ce50959550da3012a9b6be7fdc0bdd34e2"}
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.724805 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" event={"ID":"d0581951-bbb8-4550-8ced-7c1431f6deb8","Type":"ContainerStarted","Data":"71dee49c9ffc5131daffe40b07bccfedf9a1e937dfa3c9f74097a771039c0bec"}
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.740436 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-t4ptb" podStartSLOduration=6.74041579 podStartE2EDuration="6.74041579s" podCreationTimestamp="2025-12-09 03:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:30.739039132 +0000 UTC m=+152.448344568" watchObservedRunningTime="2025-12-09 03:14:30.74041579 +0000 UTC m=+152.449721206"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.769718 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" event={"ID":"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3","Type":"ContainerStarted","Data":"48205d86bcb18a689003bc83b0231a7862ce3c8e6bb4e1c05d6de3972fb262ff"}
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.772441 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx"
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.774867 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:31.274850686 +0000 UTC m=+152.984156112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.784721 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vt8xz" podStartSLOduration=127.784643551 podStartE2EDuration="2m7.784643551s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:30.782284957 +0000 UTC m=+152.491590383" watchObservedRunningTime="2025-12-09 03:14:30.784643551
+0000 UTC m=+152.493948977"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.785563 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xrdh9" event={"ID":"7aaedcf1-1390-4a48-be10-0f34ad7f0082","Type":"ContainerStarted","Data":"dc2721074cfc61a5e5167a2fb0d465a12360eb0a6c0e7d7e9e73c69440590ea7"}
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.800685 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vfvvg" event={"ID":"9577de09-1a6c-4b95-9d9f-225ad2e6be36","Type":"ContainerStarted","Data":"2ac0a0e9d7957dc302c23fea21acf5dc89c4f19c9be7fb6f2b40a9858b873cc3"}
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.801888 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vfvvg"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.815663 4766 patch_prober.go:28] interesting pod/console-operator-58897d9998-vfvvg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.816001 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vfvvg" podUID="9577de09-1a6c-4b95-9d9f-225ad2e6be36" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.831963 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xrdh9" podStartSLOduration=7.831940147 podStartE2EDuration="7.831940147s" podCreationTimestamp="2025-12-09 03:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:30.830603062 +0000 UTC m=+152.539908488" watchObservedRunningTime="2025-12-09 03:14:30.831940147 +0000 UTC m=+152.541245573"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.868397 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" event={"ID":"46928ed3-a9cb-4bcf-8647-237ef0418cb8","Type":"ContainerStarted","Data":"8ac989f15f1efa1f6a485f105c4765c9cfe469e6e9eee9ae678ad304ae9652f4"}
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.875692 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 09 03:14:30 crc kubenswrapper[4766]: E1209 03:14:30.876594 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:31.376577461 +0000 UTC m=+153.085882887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.887977 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 09 03:14:30 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld
Dec 09 03:14:30 crc kubenswrapper[4766]: [+]process-running ok
Dec 09 03:14:30 crc kubenswrapper[4766]: healthz check failed
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.888092 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.889238 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" event={"ID":"692e07fb-1bec-44f3-8112-4c5d496b7b4a","Type":"ContainerStarted","Data":"52d18fd956d2f8965f1b47b627028503751527380e382c0f96209c225506a34c"}
Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.909161 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fcbp6" podStartSLOduration=127.909140387 podStartE2EDuration="2m7.909140387s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:30.904833039 +0000 UTC m=+152.614138465" watchObservedRunningTime="2025-12-09 03:14:30.909140387 +0000 UTC m=+152.618445813" Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.929282 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vfvvg" podStartSLOduration=127.929238253 podStartE2EDuration="2m7.929238253s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:30.858439359 +0000 UTC m=+152.567744795" watchObservedRunningTime="2025-12-09 03:14:30.929238253 +0000 UTC m=+152.638543709" Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.942453 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" event={"ID":"ce678949-3cb1-499d-a5a4-a80d9337bb3f","Type":"ContainerStarted","Data":"2e6f30eedf5777997196e2d06ab8d9b75c3d33fca23bee5a8c32093ff0a70263"} Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.964336 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-phhsp" podStartSLOduration=127.964306366 podStartE2EDuration="2m7.964306366s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:30.9625843 +0000 UTC m=+152.671889716" watchObservedRunningTime="2025-12-09 03:14:30.964306366 +0000 UTC m=+152.673611792" Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.973564 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" event={"ID":"b7d85e7b-038a-4c18-96d6-4aa4c811ff44","Type":"ContainerStarted","Data":"91ff7792acfb05c7ddde71a1ddc47abf96f63d7dc6d56a2f34fd57eed9c28846"} Dec 09 03:14:30 crc kubenswrapper[4766]: I1209 03:14:30.990891 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.002204 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:31.502183117 +0000 UTC m=+153.211488563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.030378 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sj9r4" podStartSLOduration=128.030347092 podStartE2EDuration="2m8.030347092s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:30.998918318 +0000 UTC m=+152.708223754" watchObservedRunningTime="2025-12-09 03:14:31.030347092 +0000 UTC m=+152.739652518" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.046129 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" event={"ID":"c633b9b3-8368-4de4-94c9-a1220ec6f07e","Type":"ContainerStarted","Data":"f43b3a3e5849aa57e89b344ca6750651a65045e2a0b8dff46874d043998ceae0"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.099372 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" podStartSLOduration=128.099348859 podStartE2EDuration="2m8.099348859s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.098615728 +0000 UTC m=+152.807921164" watchObservedRunningTime="2025-12-09 03:14:31.099348859 +0000 UTC m=+152.808654285" Dec 09 03:14:31 crc kubenswrapper[4766]: 
I1209 03:14:31.109103 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.110785 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:31.610747699 +0000 UTC m=+153.320053125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.152473 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" event={"ID":"c9a28313-282c-4ab6-a49f-e68fee577ba7","Type":"ContainerStarted","Data":"ea6be6d1edf84563edc15aaf3c319aeb9336d584e34170611910783ff58ab98a"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.152522 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" event={"ID":"c9a28313-282c-4ab6-a49f-e68fee577ba7","Type":"ContainerStarted","Data":"e7cfe8fc938060537ba0127a15cfa668427a160a6a9178d9fdc623e1d8d5a20f"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.181134 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" event={"ID":"e2bcd744-9d24-446e-b445-b32b262e3093","Type":"ContainerStarted","Data":"e9b40e176ab92c7353f95a7f7761236163f3500afdb1bdbaa3ed67dadca51725"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.209174 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-np46v" podStartSLOduration=128.209159445 podStartE2EDuration="2m8.209159445s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.20602834 +0000 UTC m=+152.915333766" watchObservedRunningTime="2025-12-09 03:14:31.209159445 +0000 UTC m=+152.918464871" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.215117 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zxkrl" event={"ID":"d2483b49-abc4-447a-9896-90574a532a13","Type":"ContainerStarted","Data":"a52e6fff7995d8345eadec68316d6a372af4ec652daf7bed174f52e15a07880f"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.215811 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.217980 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-09 03:14:31.717957904 +0000 UTC m=+153.427263530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.235841 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" event={"ID":"14f5010c-daa1-4ded-ab26-58e171159446","Type":"ContainerStarted","Data":"fc9d78f4de26819e7d4c9eb3ec89e2fb938843133059a71bf918a6d2dfc4c188"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.236883 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.245792 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" event={"ID":"07235424-e14e-42c1-9508-746e61b3a531","Type":"ContainerStarted","Data":"0bda72495e0c89d73fa6b8f989be1d3b4d531a4ee3bac96578aae806dc9fa994"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.254251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" event={"ID":"0af4a108-cb4c-4537-859d-5ca0874be360","Type":"ContainerStarted","Data":"1a29b9bbf55370fe3a1f34b52ed33053bf8580c22892c6e3e1f7bbfeef1142f7"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.255687 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.275161 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.275901 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.285451 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-brcxt" event={"ID":"3fe6598b-ddd1-4382-b40a-2389a26958db","Type":"ContainerStarted","Data":"1b88aed8f436b4b238b862333551b944392e15198d0ef905cfb681c921a850e1"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.285497 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-brcxt" event={"ID":"3fe6598b-ddd1-4382-b40a-2389a26958db","Type":"ContainerStarted","Data":"166da72d670c1907fe8752064f1796bd35bcb66452ccf20b21edb51b2cdf1875"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.291553 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" event={"ID":"a0ba986e-97eb-4b03-ac68-e959e127efe6","Type":"ContainerStarted","Data":"0757c93cd619156880c48f107d05a1ca0ba8eeaea71b04c2908d9a4082ae3e87"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.293983 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" event={"ID":"f839f7fc-1e7f-4c71-9c02-f456ffacb094","Type":"ContainerStarted","Data":"12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.294567 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 
09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.312109 4766 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-swbbr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.312165 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.316561 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.316958 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:31.816939995 +0000 UTC m=+153.526245421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.344277 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" event={"ID":"dcdfbbc7-e90a-4e4e-a029-7b1e9c013e8f","Type":"ContainerStarted","Data":"6f17a2d131e596e3297573c9a4bf1d558bd98ad914a4e17b22d69524bb2a195a"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.357919 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhkj2" podStartSLOduration=128.357894499 podStartE2EDuration="2m8.357894499s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.289048196 +0000 UTC m=+152.998353622" watchObservedRunningTime="2025-12-09 03:14:31.357894499 +0000 UTC m=+153.067199925" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.359881 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-456mp" podStartSLOduration=128.359872073 podStartE2EDuration="2m8.359872073s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.337166255 +0000 UTC m=+153.046471681" watchObservedRunningTime="2025-12-09 03:14:31.359872073 +0000 UTC 
m=+153.069177499" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.386430 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" event={"ID":"f12f58f9-e382-49ce-843a-c9da746a5b80","Type":"ContainerStarted","Data":"cd9a492f8ccaa7f371cfca667db0ead727c225d7168762ac5a4c78a536b2b40d"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.389017 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" event={"ID":"b57e3485-ffb5-46c1-b8d6-2e0296b85fab","Type":"ContainerStarted","Data":"703d5711eb0bd64eccef7cf273f21308b8e67a05b5225fbf8f645fe8cc618040"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.400951 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82txl" event={"ID":"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed","Type":"ContainerStarted","Data":"8b5ba76ca625ba2ae30bbaa200199cfbab668094f1aa1c73bf98612f1e477253"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.419477 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" event={"ID":"6743e7ca-f9d2-4267-ad68-8a35030cb43c","Type":"ContainerStarted","Data":"be29e6568da2160fc882f0d09b0c0e36bd93e67692dd5b1a84b3a01552391303"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.419979 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.424167 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" 
event={"ID":"36b72bd7-a396-448c-a04e-7f10fdf20d03","Type":"ContainerStarted","Data":"7f65df292c7f3beec31eaf68ba388a3e215abf76894a6eb2993c43ab3c80305d"} Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.430116 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:31.929842105 +0000 UTC m=+153.639147711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.442774 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" event={"ID":"6a702e81-477c-44a0-909b-520ddc169ca8","Type":"ContainerStarted","Data":"5a266d1dff77a0fadb03723d221e2e9fb158a3259b2954bd292974d63fe977e3"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.444060 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jqb5p" podStartSLOduration=128.444034891 podStartE2EDuration="2m8.444034891s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.442411897 +0000 UTC m=+153.151717333" watchObservedRunningTime="2025-12-09 03:14:31.444034891 +0000 UTC m=+153.153340317" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.453927 4766 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjp5g" podStartSLOduration=128.453907999 podStartE2EDuration="2m8.453907999s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.402860771 +0000 UTC m=+153.112166197" watchObservedRunningTime="2025-12-09 03:14:31.453907999 +0000 UTC m=+153.163213425" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.482819 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" event={"ID":"00c0f7fd-e9af-4903-9eda-cb613560325a","Type":"ContainerStarted","Data":"1245a73fbedfa9999734cefebd41900a937361dd9e6208e5345af5f5de3bbfb0"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.503330 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" event={"ID":"2227c3d3-0ba1-4f48-969e-6572a2ef618e","Type":"ContainerStarted","Data":"84fb3fcdf7d402ebd51d862ec4a080929abf56afa58cb13d5782f75ab260d311"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.503392 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" event={"ID":"2227c3d3-0ba1-4f48-969e-6572a2ef618e","Type":"ContainerStarted","Data":"f1cd06a1ef410c65c0cfe1385403d9a6392aeaf142e954d14ce2da77776a0c33"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.521737 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:31 crc 
kubenswrapper[4766]: E1209 03:14:31.522174 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.022156275 +0000 UTC m=+153.731461701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.525662 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4p5j6" podStartSLOduration=127.525629709 podStartE2EDuration="2m7.525629709s" podCreationTimestamp="2025-12-09 03:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.471181139 +0000 UTC m=+153.180486565" watchObservedRunningTime="2025-12-09 03:14:31.525629709 +0000 UTC m=+153.234935135" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.525790 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-6pzhg" podStartSLOduration=128.525786894 podStartE2EDuration="2m8.525786894s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.517861788 +0000 UTC m=+153.227167214" watchObservedRunningTime="2025-12-09 03:14:31.525786894 +0000 UTC m=+153.235092320" Dec 
09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.529662 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" event={"ID":"6c839bfc-fc6c-4a74-a879-c9d24b335eb5","Type":"ContainerStarted","Data":"8752c4463fe1060b45161546097db6efc52d899206bd3a998aac0b0a8eb8ee91"} Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.530421 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.539891 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-fpj7h container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.539964 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fpj7h" podUID="73cae1ca-ac0e-4f90-92ec-d4077800d063" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.548932 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxnx5" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.558950 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfqbb" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.599617 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" podStartSLOduration=128.59959307 podStartE2EDuration="2m8.59959307s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.558594686 +0000 UTC m=+153.267900112" watchObservedRunningTime="2025-12-09 03:14:31.59959307 +0000 UTC m=+153.308898486" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.601916 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" podStartSLOduration=128.601908213 podStartE2EDuration="2m8.601908213s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.59626879 +0000 UTC m=+153.305574226" watchObservedRunningTime="2025-12-09 03:14:31.601908213 +0000 UTC m=+153.311213639" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.625844 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.630552 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.130516511 +0000 UTC m=+153.839821937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.638468 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" podStartSLOduration=128.638431546 podStartE2EDuration="2m8.638431546s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.637623155 +0000 UTC m=+153.346928581" watchObservedRunningTime="2025-12-09 03:14:31.638431546 +0000 UTC m=+153.347736972" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.719271 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" podStartSLOduration=128.719245994 podStartE2EDuration="2m8.719245994s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.719061049 +0000 UTC m=+153.428366485" watchObservedRunningTime="2025-12-09 03:14:31.719245994 +0000 UTC m=+153.428551420" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.719849 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lkj7" podStartSLOduration=128.71983801 podStartE2EDuration="2m8.71983801s" 
podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.673906401 +0000 UTC m=+153.383211827" watchObservedRunningTime="2025-12-09 03:14:31.71983801 +0000 UTC m=+153.429143436" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.729113 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.729790 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.22975021 +0000 UTC m=+153.939055636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.820724 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" podStartSLOduration=128.820697902 podStartE2EDuration="2m8.820697902s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.806778564 +0000 UTC m=+153.516084010" watchObservedRunningTime="2025-12-09 03:14:31.820697902 +0000 UTC m=+153.530003328" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.833450 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.833855 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.3338389 +0000 UTC m=+154.043144326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.890307 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:31 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:31 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:31 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.890399 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.908881 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8xwh7" podStartSLOduration=128.908839609 podStartE2EDuration="2m8.908839609s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:31.848727824 +0000 UTC m=+153.558033250" watchObservedRunningTime="2025-12-09 03:14:31.908839609 +0000 UTC m=+153.618145035" Dec 09 03:14:31 crc kubenswrapper[4766]: I1209 03:14:31.945388 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:31 crc kubenswrapper[4766]: E1209 03:14:31.945763 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.445739643 +0000 UTC m=+154.155045069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.048611 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.048962 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.548948239 +0000 UTC m=+154.258253665 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.150512 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.151358 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.651332632 +0000 UTC m=+154.360638058 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.252902 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.253796 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.753781038 +0000 UTC m=+154.463086464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.354817 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.355449 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.855426332 +0000 UTC m=+154.564731758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.456791 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.457132 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:32.957120727 +0000 UTC m=+154.666426153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.558918 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.559327 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.059301996 +0000 UTC m=+154.768607452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.582118 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-77sxr" event={"ID":"6743e7ca-f9d2-4267-ad68-8a35030cb43c","Type":"ContainerStarted","Data":"a82b976167b24cff67451c94f25f5b573f3339eecc74d89725dfa0817267d07a"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.585158 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-brcxt" event={"ID":"3fe6598b-ddd1-4382-b40a-2389a26958db","Type":"ContainerStarted","Data":"a345eba4f8ab68a2a79ba34a42e59d6c29c01d90d658ff851973f12a7c04596c"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.586527 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.591599 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" event={"ID":"36b72bd7-a396-448c-a04e-7f10fdf20d03","Type":"ContainerStarted","Data":"1aa2acd5c5bb1638dc6c71e41be124089d3056acd1e3b8fcab83aec77bd83fc1"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.591657 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" event={"ID":"36b72bd7-a396-448c-a04e-7f10fdf20d03","Type":"ContainerStarted","Data":"d772b7541f4f6d070af6310a8827a4395a14d100898f37d6c581c5eae18fb073"} Dec 09 03:14:32 crc kubenswrapper[4766]: 
I1209 03:14:32.601545 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82txl" event={"ID":"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed","Type":"ContainerStarted","Data":"08f8e81b66e5363c699189a6a70ee4cca8f5e5cf72ed705a9bec462145aa6008"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.601599 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82txl" event={"ID":"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed","Type":"ContainerStarted","Data":"2e87d40a62094dafc629a8fcc7091949bbdb3f4e0a9bc5d162bb35f5fa3fcdde"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.604299 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" event={"ID":"6a702e81-477c-44a0-909b-520ddc169ca8","Type":"ContainerStarted","Data":"cf4b67a0672417f1592f7203ca0136b025c91f797c2d7030f9eb44efc012e3a7"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.604325 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" event={"ID":"6a702e81-477c-44a0-909b-520ddc169ca8","Type":"ContainerStarted","Data":"00578c2b8b6a3d56a26ce879ecceae935640f990c81fff2f036f4144407a5a5b"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.621372 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" event={"ID":"b7d85e7b-038a-4c18-96d6-4aa4c811ff44","Type":"ContainerStarted","Data":"fb524157a119eb28ff25811e24a8d76edc96c0c941f78ee4ccc1bc47bcea5f9e"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.622065 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.624458 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-94j6f" event={"ID":"00c0f7fd-e9af-4903-9eda-cb613560325a","Type":"ContainerStarted","Data":"76e946cdb36bb7aa031af1d21990f9a15c88ef2dcedb5ec510360980e0cd2741"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.633619 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" event={"ID":"d0fdcd8a-ec05-47e3-8509-ec4c57a3c5e3","Type":"ContainerStarted","Data":"9227532a5b48a6d22cc07ba9819f1235b4a15629c898ec10e14a34da75934bd9"} Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.639742 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-brcxt" podStartSLOduration=9.639724113 podStartE2EDuration="9.639724113s" podCreationTimestamp="2025-12-09 03:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:32.638124139 +0000 UTC m=+154.347429565" watchObservedRunningTime="2025-12-09 03:14:32.639724113 +0000 UTC m=+154.349029539" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.640994 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.660053 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.673381 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.173356967 +0000 UTC m=+154.882662393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.680713 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vfvvg" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.684717 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9wx6d" podStartSLOduration=129.684685735 podStartE2EDuration="2m9.684685735s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:32.682550887 +0000 UTC m=+154.391856313" watchObservedRunningTime="2025-12-09 03:14:32.684685735 +0000 UTC m=+154.393991161" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.730657 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c75f2" podStartSLOduration=129.730627654 podStartE2EDuration="2m9.730627654s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:32.717229329 +0000 UTC 
m=+154.426534765" watchObservedRunningTime="2025-12-09 03:14:32.730627654 +0000 UTC m=+154.439933080" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.761132 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.761534 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.261508063 +0000 UTC m=+154.970813489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.786160 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jpckj" podStartSLOduration=129.786136593 podStartE2EDuration="2m9.786136593s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:32.785705922 +0000 UTC m=+154.495011348" watchObservedRunningTime="2025-12-09 03:14:32.786136593 +0000 UTC m=+154.495442019" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 
03:14:32.786654 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" podStartSLOduration=129.786649447 podStartE2EDuration="2m9.786649447s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:32.75474904 +0000 UTC m=+154.464054466" watchObservedRunningTime="2025-12-09 03:14:32.786649447 +0000 UTC m=+154.495954873" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.864661 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.865043 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.365030648 +0000 UTC m=+155.074336074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.886919 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:32 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:32 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:32 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.886996 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:32 crc kubenswrapper[4766]: I1209 03:14:32.965437 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:32 crc kubenswrapper[4766]: E1209 03:14:32.965692 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 03:14:33.465666645 +0000 UTC m=+155.174972071 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.000760 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j6mbs"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.001885 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.030855 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.055729 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6mbs"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.068710 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfm7z\" (UniqueName: \"kubernetes.io/projected/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-kube-api-access-bfm7z\") pod \"community-operators-j6mbs\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.068778 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.068813 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-utilities\") pod \"community-operators-j6mbs\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.068832 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-catalog-content\") pod \"community-operators-j6mbs\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.069299 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.569283462 +0000 UTC m=+155.278588888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.163256 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6vt2b"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.164493 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.169811 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.170124 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-utilities\") pod \"community-operators-j6mbs\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.170193 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.670156795 +0000 UTC m=+155.379462211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.170263 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-catalog-content\") pod \"community-operators-j6mbs\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.170506 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfm7z\" (UniqueName: \"kubernetes.io/projected/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-kube-api-access-bfm7z\") pod \"community-operators-j6mbs\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.170557 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.170902 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-09 03:14:33.670894605 +0000 UTC m=+155.380200031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.171176 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-utilities\") pod \"community-operators-j6mbs\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.171679 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.180866 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-catalog-content\") pod \"community-operators-j6mbs\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.196524 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vt2b"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.214094 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfm7z\" (UniqueName: \"kubernetes.io/projected/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-kube-api-access-bfm7z\") pod \"community-operators-j6mbs\" 
(UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.273071 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.273293 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-utilities\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.273394 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrzt\" (UniqueName: \"kubernetes.io/projected/dd777685-39e7-46bb-824f-f19ceaa179ca-kube-api-access-dlrzt\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.273430 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-catalog-content\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.273569 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.773548106 +0000 UTC m=+155.482853532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.315684 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.348925 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l2ll7"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.350039 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.366326 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2ll7"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.378353 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.378413 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrzt\" (UniqueName: \"kubernetes.io/projected/dd777685-39e7-46bb-824f-f19ceaa179ca-kube-api-access-dlrzt\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.378451 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-catalog-content\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.378472 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-utilities\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.378904 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-utilities\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.379143 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-catalog-content\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.379441 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.879414015 +0000 UTC m=+155.588719441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.399278 4766 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.423971 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrzt\" (UniqueName: \"kubernetes.io/projected/dd777685-39e7-46bb-824f-f19ceaa179ca-kube-api-access-dlrzt\") pod \"certified-operators-6vt2b\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.482121 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.482506 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4smpj\" (UniqueName: \"kubernetes.io/projected/b49b8e85-c4e7-4741-b61b-af26993717c2-kube-api-access-4smpj\") pod \"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.482565 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-catalog-content\") pod \"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.482586 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-utilities\") pod \"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.482652 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:33.982629211 +0000 UTC m=+155.691934637 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.542287 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zzft2"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.543614 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.558945 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzft2"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.560584 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.595195 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.595265 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2znl\" (UniqueName: \"kubernetes.io/projected/2b4dc351-fc6c-478e-8695-e6e778b7be24-kube-api-access-m2znl\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.595288 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-utilities\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.595322 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-catalog-content\") pod 
\"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.595343 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-utilities\") pod \"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.595365 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-catalog-content\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.595433 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4smpj\" (UniqueName: \"kubernetes.io/projected/b49b8e85-c4e7-4741-b61b-af26993717c2-kube-api-access-4smpj\") pod \"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.595789 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:34.095746697 +0000 UTC m=+155.805052123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.596461 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-catalog-content\") pod \"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.596549 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-utilities\") pod \"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.644637 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4smpj\" (UniqueName: \"kubernetes.io/projected/b49b8e85-c4e7-4741-b61b-af26993717c2-kube-api-access-4smpj\") pod \"community-operators-l2ll7\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") " pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.699906 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.700169 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2znl\" (UniqueName: \"kubernetes.io/projected/2b4dc351-fc6c-478e-8695-e6e778b7be24-kube-api-access-m2znl\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.700204 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-utilities\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.700255 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-catalog-content\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.700764 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-catalog-content\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.701202 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 03:14:34.201182033 +0000 UTC m=+155.910487459 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.701806 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-utilities\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.709710 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2ll7" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.731261 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2znl\" (UniqueName: \"kubernetes.io/projected/2b4dc351-fc6c-478e-8695-e6e778b7be24-kube-api-access-m2znl\") pod \"certified-operators-zzft2\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") " pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.756494 4766 generic.go:334] "Generic (PLEG): container finished" podID="b57e3485-ffb5-46c1-b8d6-2e0296b85fab" containerID="703d5711eb0bd64eccef7cf273f21308b8e67a05b5225fbf8f645fe8cc618040" exitCode=0 Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.756574 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" event={"ID":"b57e3485-ffb5-46c1-b8d6-2e0296b85fab","Type":"ContainerDied","Data":"703d5711eb0bd64eccef7cf273f21308b8e67a05b5225fbf8f645fe8cc618040"} Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.782411 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82txl" event={"ID":"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed","Type":"ContainerStarted","Data":"60522069f491f1bab0103dbf77274d2ec0443ddb75f8088a167dbdc042ffc002"} Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.801330 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.801777 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:34.301761798 +0000 UTC m=+156.011067224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.818556 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kmq9l" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.843854 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-82txl" podStartSLOduration=10.843828512 podStartE2EDuration="10.843828512s" podCreationTimestamp="2025-12-09 03:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:33.841676294 +0000 UTC m=+155.550981720" watchObservedRunningTime="2025-12-09 03:14:33.843828512 +0000 UTC m=+155.553133938" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.871507 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6mbs"] Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.886548 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:33 crc 
kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:33 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:33 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.886631 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.912726 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:33 crc kubenswrapper[4766]: E1209 03:14:33.916169 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:34.416145268 +0000 UTC m=+156.125450694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:33 crc kubenswrapper[4766]: I1209 03:14:33.919963 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzft2" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.019161 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:34 crc kubenswrapper[4766]: E1209 03:14:34.019630 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:34.519612942 +0000 UTC m=+156.228918368 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.025700 4766 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-09T03:14:33.399308706Z","Handler":null,"Name":""} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.121117 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:34 crc kubenswrapper[4766]: E1209 03:14:34.121383 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 03:14:34.621304486 +0000 UTC m=+156.330609902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.121892 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:34 crc kubenswrapper[4766]: E1209 03:14:34.122409 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 03:14:34.622390337 +0000 UTC m=+156.331695763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gljdx" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.121538 4766 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.122666 4766 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.147827 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vt2b"] Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.222675 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.228629 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.315426 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzft2"] Dec 09 03:14:34 crc kubenswrapper[4766]: W1209 03:14:34.319079 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b4dc351_fc6c_478e_8695_e6e778b7be24.slice/crio-c3a45893c01effe86ad73ab0ea23c2e5152a8b385cbadf773318da227a69208b WatchSource:0}: Error finding container c3a45893c01effe86ad73ab0ea23c2e5152a8b385cbadf773318da227a69208b: Status 404 returned error can't find the container with id c3a45893c01effe86ad73ab0ea23c2e5152a8b385cbadf773318da227a69208b Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.323833 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.425660 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2ll7"] Dec 09 03:14:34 crc kubenswrapper[4766]: W1209 03:14:34.436051 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb49b8e85_c4e7_4741_b61b_af26993717c2.slice/crio-442ca3064c0376513aa35143db734c2e4abf3c41c2258d7058fa605c71ada9c2 WatchSource:0}: Error finding container 442ca3064c0376513aa35143db734c2e4abf3c41c2258d7058fa605c71ada9c2: Status 404 returned error can't find the container with id 442ca3064c0376513aa35143db734c2e4abf3c41c2258d7058fa605c71ada9c2 Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.551397 4766 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.551471 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.604619 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gljdx\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.788616 4766 generic.go:334] "Generic (PLEG): container finished" podID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerID="b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0" exitCode=0 Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.788735 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6mbs" event={"ID":"8a05d77b-68a0-4e96-b71e-f5168ee4d38f","Type":"ContainerDied","Data":"b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.789190 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6mbs" 
event={"ID":"8a05d77b-68a0-4e96-b71e-f5168ee4d38f","Type":"ContainerStarted","Data":"55040f3463f3b9b8a2b1a8bf0a19c46d78109380db06d5b65cc9344f2237e8af"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.791805 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-82txl" event={"ID":"ed5281e5-9357-4e0d-bb0f-ab3a1b177bed","Type":"ContainerStarted","Data":"2f31e5a5768b6f0a1a071cdfd4437e41bb93fb8ce153906eecf4ca08694feb34"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.791896 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.795592 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerID="08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb" exitCode=0 Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.795666 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzft2" event={"ID":"2b4dc351-fc6c-478e-8695-e6e778b7be24","Type":"ContainerDied","Data":"08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.795698 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzft2" event={"ID":"2b4dc351-fc6c-478e-8695-e6e778b7be24","Type":"ContainerStarted","Data":"c3a45893c01effe86ad73ab0ea23c2e5152a8b385cbadf773318da227a69208b"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.799845 4766 generic.go:334] "Generic (PLEG): container finished" podID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerID="082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e" exitCode=0 Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.799943 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vt2b" 
event={"ID":"dd777685-39e7-46bb-824f-f19ceaa179ca","Type":"ContainerDied","Data":"082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.799985 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vt2b" event={"ID":"dd777685-39e7-46bb-824f-f19ceaa179ca","Type":"ContainerStarted","Data":"2f2f711b747d17009217303fe11c2e2cdc3f1d8fd0cc5d4af7d4166289d930a6"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.805559 4766 generic.go:334] "Generic (PLEG): container finished" podID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerID="df807debd83b0d0876f0c7fe0fb95a759bfeb552bcee7a90e8d3a6b2e3b347d5" exitCode=0 Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.807199 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2ll7" event={"ID":"b49b8e85-c4e7-4741-b61b-af26993717c2","Type":"ContainerDied","Data":"df807debd83b0d0876f0c7fe0fb95a759bfeb552bcee7a90e8d3a6b2e3b347d5"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.807257 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2ll7" event={"ID":"b49b8e85-c4e7-4741-b61b-af26993717c2","Type":"ContainerStarted","Data":"442ca3064c0376513aa35143db734c2e4abf3c41c2258d7058fa605c71ada9c2"} Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.815103 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.860588 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.883932 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:34 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:34 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:34 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.884033 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.939085 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pxzck"] Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.950701 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.952539 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 03:14:34 crc kubenswrapper[4766]: I1209 03:14:34.955829 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxzck"] Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.023090 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.106396 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gljdx"] Dec 09 03:14:35 crc kubenswrapper[4766]: W1209 03:14:35.115329 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e81ed6_e457_4e59_a25e_b40dceea3cfd.slice/crio-4ce2e10d3686ea19c2f372064a9ba48d36cbdfa5a5b6ae120b2874bf56805f76 WatchSource:0}: Error finding container 4ce2e10d3686ea19c2f372064a9ba48d36cbdfa5a5b6ae120b2874bf56805f76: Status 404 returned error can't find the container with id 4ce2e10d3686ea19c2f372064a9ba48d36cbdfa5a5b6ae120b2874bf56805f76 Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.146811 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-secret-volume\") pod \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.146903 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f55j\" (UniqueName: 
\"kubernetes.io/projected/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-kube-api-access-8f55j\") pod \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.146959 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-config-volume\") pod \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\" (UID: \"b57e3485-ffb5-46c1-b8d6-2e0296b85fab\") " Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.147069 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-utilities\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.147146 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8f2q\" (UniqueName: \"kubernetes.io/projected/b2c59da0-4314-401f-9b47-cd3df45f4d26-kube-api-access-h8f2q\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.147215 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-catalog-content\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.147576 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "b57e3485-ffb5-46c1-b8d6-2e0296b85fab" (UID: "b57e3485-ffb5-46c1-b8d6-2e0296b85fab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.155146 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-kube-api-access-8f55j" (OuterVolumeSpecName: "kube-api-access-8f55j") pod "b57e3485-ffb5-46c1-b8d6-2e0296b85fab" (UID: "b57e3485-ffb5-46c1-b8d6-2e0296b85fab"). InnerVolumeSpecName "kube-api-access-8f55j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.155248 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b57e3485-ffb5-46c1-b8d6-2e0296b85fab" (UID: "b57e3485-ffb5-46c1-b8d6-2e0296b85fab"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.248425 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-catalog-content\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.248514 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-utilities\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.248557 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8f2q\" (UniqueName: \"kubernetes.io/projected/b2c59da0-4314-401f-9b47-cd3df45f4d26-kube-api-access-h8f2q\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.248601 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.248616 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f55j\" (UniqueName: \"kubernetes.io/projected/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-kube-api-access-8f55j\") on node \"crc\" DevicePath \"\"" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.248628 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b57e3485-ffb5-46c1-b8d6-2e0296b85fab-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.249066 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-catalog-content\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.249466 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-utilities\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.272658 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8f2q\" (UniqueName: \"kubernetes.io/projected/b2c59da0-4314-401f-9b47-cd3df45f4d26-kube-api-access-h8f2q\") pod \"redhat-marketplace-pxzck\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.285036 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.334941 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhkk"] Dec 09 03:14:35 crc kubenswrapper[4766]: E1209 03:14:35.335283 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57e3485-ffb5-46c1-b8d6-2e0296b85fab" containerName="collect-profiles" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.335301 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57e3485-ffb5-46c1-b8d6-2e0296b85fab" containerName="collect-profiles" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.335439 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57e3485-ffb5-46c1-b8d6-2e0296b85fab" containerName="collect-profiles" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.336454 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.350616 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-utilities\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.350677 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dlcp\" (UniqueName: \"kubernetes.io/projected/2f459552-2bcd-4b18-98eb-7e523f6d69d8-kube-api-access-6dlcp\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.350731 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-catalog-content\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.356553 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhkk"] Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.415938 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.416279 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.426781 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.452335 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-utilities\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.452431 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dlcp\" (UniqueName: \"kubernetes.io/projected/2f459552-2bcd-4b18-98eb-7e523f6d69d8-kube-api-access-6dlcp\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.452565 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-catalog-content\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.454735 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-catalog-content\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.458702 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-utilities\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.485350 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlcp\" (UniqueName: \"kubernetes.io/projected/2f459552-2bcd-4b18-98eb-7e523f6d69d8-kube-api-access-6dlcp\") pod \"redhat-marketplace-9bhkk\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") " pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.539948 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxzck"] Dec 09 03:14:35 crc kubenswrapper[4766]: W1209 03:14:35.556523 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2c59da0_4314_401f_9b47_cd3df45f4d26.slice/crio-8c315f27d05f11ea8219aed9b0e9a1f2ff87e187ad430bd868f9ec46a9e5b24b WatchSource:0}: Error finding container 8c315f27d05f11ea8219aed9b0e9a1f2ff87e187ad430bd868f9ec46a9e5b24b: 
Status 404 returned error can't find the container with id 8c315f27d05f11ea8219aed9b0e9a1f2ff87e187ad430bd868f9ec46a9e5b24b Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.658666 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.827815 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" event={"ID":"b57e3485-ffb5-46c1-b8d6-2e0296b85fab","Type":"ContainerDied","Data":"b8b692e59d2abaa4c2a28c89274e3ebf53d4ce08a8bf96a533dc40fb2547302f"} Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.827866 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8b692e59d2abaa4c2a28c89274e3ebf53d4ce08a8bf96a533dc40fb2547302f" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.827972 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.853092 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" event={"ID":"59e81ed6-e457-4e59-a25e-b40dceea3cfd","Type":"ContainerStarted","Data":"2d97322ab85d0f1515da078d74c850d07cf70be43283fee04800e7a91b430a82"} Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.853168 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" event={"ID":"59e81ed6-e457-4e59-a25e-b40dceea3cfd","Type":"ContainerStarted","Data":"4ce2e10d3686ea19c2f372064a9ba48d36cbdfa5a5b6ae120b2874bf56805f76"} Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.853242 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:35 crc kubenswrapper[4766]: 
I1209 03:14:35.858376 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxzck" event={"ID":"b2c59da0-4314-401f-9b47-cd3df45f4d26","Type":"ContainerStarted","Data":"8c315f27d05f11ea8219aed9b0e9a1f2ff87e187ad430bd868f9ec46a9e5b24b"} Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.863909 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-m8mqz" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.876949 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" podStartSLOduration=132.876922103 podStartE2EDuration="2m12.876922103s" podCreationTimestamp="2025-12-09 03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:14:35.875356811 +0000 UTC m=+157.584662247" watchObservedRunningTime="2025-12-09 03:14:35.876922103 +0000 UTC m=+157.586227529" Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.887600 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:35 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:35 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:35 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:35 crc kubenswrapper[4766]: I1209 03:14:35.887672 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.188547 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.189046 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.193462 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qnkzl"] Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.199205 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.201587 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-fpj7h container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.201606 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-fpj7h container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.201658 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fpj7h" podUID="73cae1ca-ac0e-4f90-92ec-d4077800d063" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.201688 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fpj7h" podUID="73cae1ca-ac0e-4f90-92ec-d4077800d063" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: 
connection refused" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.209511 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.209754 4766 patch_prober.go:28] interesting pod/console-f9d7485db-ss24g container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.209839 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ss24g" podUID="34120810-df87-4443-a2f7-16982e46027d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.253855 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnkzl"] Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.318754 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhkk"] Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.373217 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tmvk\" (UniqueName: \"kubernetes.io/projected/9fee64f2-1434-4082-9674-088e4d93cb9a-kube-api-access-8tmvk\") pod \"redhat-operators-qnkzl\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.373296 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-catalog-content\") pod \"redhat-operators-qnkzl\" (UID: 
\"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.373333 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-utilities\") pod \"redhat-operators-qnkzl\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: W1209 03:14:36.440952 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f459552_2bcd_4b18_98eb_7e523f6d69d8.slice/crio-a7fbe9a4b82df2f0c408d5e5a483d705ef62fb3d9bc91bf33b37a275805d5ab9 WatchSource:0}: Error finding container a7fbe9a4b82df2f0c408d5e5a483d705ef62fb3d9bc91bf33b37a275805d5ab9: Status 404 returned error can't find the container with id a7fbe9a4b82df2f0c408d5e5a483d705ef62fb3d9bc91bf33b37a275805d5ab9 Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.475036 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tmvk\" (UniqueName: \"kubernetes.io/projected/9fee64f2-1434-4082-9674-088e4d93cb9a-kube-api-access-8tmvk\") pod \"redhat-operators-qnkzl\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.475091 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-catalog-content\") pod \"redhat-operators-qnkzl\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.475111 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-utilities\") pod \"redhat-operators-qnkzl\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.475815 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-utilities\") pod \"redhat-operators-qnkzl\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.476770 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-catalog-content\") pod \"redhat-operators-qnkzl\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.505089 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tmvk\" (UniqueName: \"kubernetes.io/projected/9fee64f2-1434-4082-9674-088e4d93cb9a-kube-api-access-8tmvk\") pod \"redhat-operators-qnkzl\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.538808 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.539367 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cltm8"] Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.540952 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.556844 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cltm8"] Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.679783 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-utilities\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.679857 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-catalog-content\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.679908 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/28803301-4e5b-4b88-b04a-1dfd4eb1d903-kube-api-access-s52x9\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.762084 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.763253 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.768847 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.769106 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.773805 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.782039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-utilities\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.782090 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-catalog-content\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.782128 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/28803301-4e5b-4b88-b04a-1dfd4eb1d903-kube-api-access-s52x9\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.783096 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-utilities\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.783338 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-catalog-content\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.808197 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/28803301-4e5b-4b88-b04a-1dfd4eb1d903-kube-api-access-s52x9\") pod \"redhat-operators-cltm8\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") " pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.877904 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.880518 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhkk" event={"ID":"2f459552-2bcd-4b18-98eb-7e523f6d69d8","Type":"ContainerStarted","Data":"8f9fe9800961b2f53dca719e32f940880e60c12070d3697c1f0f141bdc5dd5a0"} Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.880588 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhkk" event={"ID":"2f459552-2bcd-4b18-98eb-7e523f6d69d8","Type":"ContainerStarted","Data":"a7fbe9a4b82df2f0c408d5e5a483d705ef62fb3d9bc91bf33b37a275805d5ab9"} Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.881846 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:36 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:36 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:36 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.881882 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.885726 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e320261d-efc3-454d-b4cc-103017b22f7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e320261d-efc3-454d-b4cc-103017b22f7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.885792 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e320261d-efc3-454d-b4cc-103017b22f7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e320261d-efc3-454d-b4cc-103017b22f7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.915791 4766 generic.go:334] "Generic (PLEG): container finished" podID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerID="779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8" exitCode=0 Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.916390 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxzck" 
event={"ID":"b2c59da0-4314-401f-9b47-cd3df45f4d26","Type":"ContainerDied","Data":"779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8"} Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.922035 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.987312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e320261d-efc3-454d-b4cc-103017b22f7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e320261d-efc3-454d-b4cc-103017b22f7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.987456 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e320261d-efc3-454d-b4cc-103017b22f7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e320261d-efc3-454d-b4cc-103017b22f7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:36 crc kubenswrapper[4766]: I1209 03:14:36.996442 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e320261d-efc3-454d-b4cc-103017b22f7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e320261d-efc3-454d-b4cc-103017b22f7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.029141 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e320261d-efc3-454d-b4cc-103017b22f7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e320261d-efc3-454d-b4cc-103017b22f7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.086337 4766 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.135334 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnkzl"] Dec 09 03:14:37 crc kubenswrapper[4766]: W1209 03:14:37.171505 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fee64f2_1434_4082_9674_088e4d93cb9a.slice/crio-30410c135c8bfa974e3f7889a8f69689180eabf8b7f4f60535ae276c1dd1d1a9 WatchSource:0}: Error finding container 30410c135c8bfa974e3f7889a8f69689180eabf8b7f4f60535ae276c1dd1d1a9: Status 404 returned error can't find the container with id 30410c135c8bfa974e3f7889a8f69689180eabf8b7f4f60535ae276c1dd1d1a9 Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.318638 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.318727 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.362640 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cltm8"] Dec 09 03:14:37 crc kubenswrapper[4766]: W1209 03:14:37.377750 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28803301_4e5b_4b88_b04a_1dfd4eb1d903.slice/crio-473f6fd500ff828a557cc45df7de003f33d82755e09c6fd4ecbccb32207d0e5e WatchSource:0}: Error finding container 473f6fd500ff828a557cc45df7de003f33d82755e09c6fd4ecbccb32207d0e5e: Status 404 returned error can't find the container with id 473f6fd500ff828a557cc45df7de003f33d82755e09c6fd4ecbccb32207d0e5e Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.482270 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.883771 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:37 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:37 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:37 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.883859 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.936489 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e320261d-efc3-454d-b4cc-103017b22f7b","Type":"ContainerStarted","Data":"0967591808e67af760d87ec9d769a2fb4d7621f5188a1465c31ed109f173f711"} Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.939623 4766 generic.go:334] "Generic (PLEG): container finished" podID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerID="c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d" 
exitCode=0 Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.939693 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cltm8" event={"ID":"28803301-4e5b-4b88-b04a-1dfd4eb1d903","Type":"ContainerDied","Data":"c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d"} Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.939754 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cltm8" event={"ID":"28803301-4e5b-4b88-b04a-1dfd4eb1d903","Type":"ContainerStarted","Data":"473f6fd500ff828a557cc45df7de003f33d82755e09c6fd4ecbccb32207d0e5e"} Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.951047 4766 generic.go:334] "Generic (PLEG): container finished" podID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerID="4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883" exitCode=0 Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.951160 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnkzl" event={"ID":"9fee64f2-1434-4082-9674-088e4d93cb9a","Type":"ContainerDied","Data":"4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883"} Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.951193 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnkzl" event={"ID":"9fee64f2-1434-4082-9674-088e4d93cb9a","Type":"ContainerStarted","Data":"30410c135c8bfa974e3f7889a8f69689180eabf8b7f4f60535ae276c1dd1d1a9"} Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.953901 4766 generic.go:334] "Generic (PLEG): container finished" podID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerID="8f9fe9800961b2f53dca719e32f940880e60c12070d3697c1f0f141bdc5dd5a0" exitCode=0 Dec 09 03:14:37 crc kubenswrapper[4766]: I1209 03:14:37.953941 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhkk" 
event={"ID":"2f459552-2bcd-4b18-98eb-7e523f6d69d8","Type":"ContainerDied","Data":"8f9fe9800961b2f53dca719e32f940880e60c12070d3697c1f0f141bdc5dd5a0"} Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.023256 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.024255 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.026585 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.031100 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.035955 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.108154 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81005d27-0219-4f00-8191-624550153be5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"81005d27-0219-4f00-8191-624550153be5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.108290 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81005d27-0219-4f00-8191-624550153be5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"81005d27-0219-4f00-8191-624550153be5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.209290 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81005d27-0219-4f00-8191-624550153be5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"81005d27-0219-4f00-8191-624550153be5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.209403 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81005d27-0219-4f00-8191-624550153be5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"81005d27-0219-4f00-8191-624550153be5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.209495 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81005d27-0219-4f00-8191-624550153be5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"81005d27-0219-4f00-8191-624550153be5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.231406 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81005d27-0219-4f00-8191-624550153be5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"81005d27-0219-4f00-8191-624550153be5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.347832 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.884147 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:38 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:38 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:38 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.885724 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.902744 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 03:14:38 crc kubenswrapper[4766]: W1209 03:14:38.949947 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod81005d27_0219_4f00_8191_624550153be5.slice/crio-a7506915960ed981990012aacc68c27a5bcd0ad84a501babb980fc47ba9d8262 WatchSource:0}: Error finding container a7506915960ed981990012aacc68c27a5bcd0ad84a501babb980fc47ba9d8262: Status 404 returned error can't find the container with id a7506915960ed981990012aacc68c27a5bcd0ad84a501babb980fc47ba9d8262 Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.984932 4766 generic.go:334] "Generic (PLEG): container finished" podID="e320261d-efc3-454d-b4cc-103017b22f7b" containerID="948e3c7c49a7e401e2be376fc81cb82ed4f06ec28c079a59701c582a91a06e96" exitCode=0 Dec 09 03:14:38 crc kubenswrapper[4766]: I1209 03:14:38.985053 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e320261d-efc3-454d-b4cc-103017b22f7b","Type":"ContainerDied","Data":"948e3c7c49a7e401e2be376fc81cb82ed4f06ec28c079a59701c582a91a06e96"} Dec 09 03:14:39 crc kubenswrapper[4766]: I1209 03:14:39.006760 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"81005d27-0219-4f00-8191-624550153be5","Type":"ContainerStarted","Data":"a7506915960ed981990012aacc68c27a5bcd0ad84a501babb980fc47ba9d8262"} Dec 09 03:14:39 crc kubenswrapper[4766]: I1209 03:14:39.880379 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:39 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:39 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:39 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:39 crc kubenswrapper[4766]: I1209 03:14:39.880904 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.575965 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.669420 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e320261d-efc3-454d-b4cc-103017b22f7b-kubelet-dir\") pod \"e320261d-efc3-454d-b4cc-103017b22f7b\" (UID: \"e320261d-efc3-454d-b4cc-103017b22f7b\") " Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.669488 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e320261d-efc3-454d-b4cc-103017b22f7b-kube-api-access\") pod \"e320261d-efc3-454d-b4cc-103017b22f7b\" (UID: \"e320261d-efc3-454d-b4cc-103017b22f7b\") " Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.669537 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e320261d-efc3-454d-b4cc-103017b22f7b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e320261d-efc3-454d-b4cc-103017b22f7b" (UID: "e320261d-efc3-454d-b4cc-103017b22f7b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.670034 4766 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e320261d-efc3-454d-b4cc-103017b22f7b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.689675 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e320261d-efc3-454d-b4cc-103017b22f7b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e320261d-efc3-454d-b4cc-103017b22f7b" (UID: "e320261d-efc3-454d-b4cc-103017b22f7b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.773723 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e320261d-efc3-454d-b4cc-103017b22f7b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.880883 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:40 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:40 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:40 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:40 crc kubenswrapper[4766]: I1209 03:14:40.880944 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:41 crc kubenswrapper[4766]: I1209 03:14:41.042530 4766 generic.go:334] "Generic (PLEG): container finished" podID="81005d27-0219-4f00-8191-624550153be5" containerID="2d4a8a5a11df97d181efe143fd66a42eb1367131803712dd21a79abaeeb45b84" exitCode=0 Dec 09 03:14:41 crc kubenswrapper[4766]: I1209 03:14:41.042621 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"81005d27-0219-4f00-8191-624550153be5","Type":"ContainerDied","Data":"2d4a8a5a11df97d181efe143fd66a42eb1367131803712dd21a79abaeeb45b84"} Dec 09 03:14:41 crc kubenswrapper[4766]: I1209 03:14:41.048690 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"e320261d-efc3-454d-b4cc-103017b22f7b","Type":"ContainerDied","Data":"0967591808e67af760d87ec9d769a2fb4d7621f5188a1465c31ed109f173f711"} Dec 09 03:14:41 crc kubenswrapper[4766]: I1209 03:14:41.048785 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0967591808e67af760d87ec9d769a2fb4d7621f5188a1465c31ed109f173f711" Dec 09 03:14:41 crc kubenswrapper[4766]: I1209 03:14:41.048851 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 03:14:41 crc kubenswrapper[4766]: I1209 03:14:41.882429 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:41 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:41 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:41 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:41 crc kubenswrapper[4766]: I1209 03:14:41.882519 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.384130 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-brcxt" Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.452321 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.635277 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81005d27-0219-4f00-8191-624550153be5-kube-api-access\") pod \"81005d27-0219-4f00-8191-624550153be5\" (UID: \"81005d27-0219-4f00-8191-624550153be5\") " Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.635427 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81005d27-0219-4f00-8191-624550153be5-kubelet-dir\") pod \"81005d27-0219-4f00-8191-624550153be5\" (UID: \"81005d27-0219-4f00-8191-624550153be5\") " Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.635583 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81005d27-0219-4f00-8191-624550153be5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "81005d27-0219-4f00-8191-624550153be5" (UID: "81005d27-0219-4f00-8191-624550153be5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.636175 4766 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81005d27-0219-4f00-8191-624550153be5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.642925 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81005d27-0219-4f00-8191-624550153be5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "81005d27-0219-4f00-8191-624550153be5" (UID: "81005d27-0219-4f00-8191-624550153be5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.739277 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81005d27-0219-4f00-8191-624550153be5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.881586 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:42 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:42 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:42 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:42 crc kubenswrapper[4766]: I1209 03:14:42.881673 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:43 crc kubenswrapper[4766]: I1209 03:14:43.083926 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"81005d27-0219-4f00-8191-624550153be5","Type":"ContainerDied","Data":"a7506915960ed981990012aacc68c27a5bcd0ad84a501babb980fc47ba9d8262"} Dec 09 03:14:43 crc kubenswrapper[4766]: I1209 03:14:43.084302 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7506915960ed981990012aacc68c27a5bcd0ad84a501babb980fc47ba9d8262" Dec 09 03:14:43 crc kubenswrapper[4766]: I1209 03:14:43.084000 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 03:14:43 crc kubenswrapper[4766]: I1209 03:14:43.881295 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:43 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:43 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:43 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:43 crc kubenswrapper[4766]: I1209 03:14:43.881397 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:44 crc kubenswrapper[4766]: I1209 03:14:44.880146 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:44 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:44 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:44 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:44 crc kubenswrapper[4766]: I1209 03:14:44.881022 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:45 crc kubenswrapper[4766]: I1209 03:14:45.881294 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:45 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:45 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:45 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:45 crc kubenswrapper[4766]: I1209 03:14:45.881453 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:46 crc kubenswrapper[4766]: I1209 03:14:46.099754 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:46 crc kubenswrapper[4766]: I1209 03:14:46.108340 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c9eb693-99eb-4b02-b33a-26d506eeb3f1-metrics-certs\") pod \"network-metrics-daemon-z6qth\" (UID: \"5c9eb693-99eb-4b02-b33a-26d506eeb3f1\") " pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:46 crc kubenswrapper[4766]: I1209 03:14:46.187022 4766 patch_prober.go:28] interesting pod/console-f9d7485db-ss24g container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 09 03:14:46 crc kubenswrapper[4766]: I1209 03:14:46.187125 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ss24g" podUID="34120810-df87-4443-a2f7-16982e46027d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 
10.217.0.18:8443: connect: connection refused" Dec 09 03:14:46 crc kubenswrapper[4766]: I1209 03:14:46.214649 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-fpj7h" Dec 09 03:14:46 crc kubenswrapper[4766]: I1209 03:14:46.385673 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6qth" Dec 09 03:14:46 crc kubenswrapper[4766]: I1209 03:14:46.882647 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:46 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:46 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:46 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:46 crc kubenswrapper[4766]: I1209 03:14:46.882726 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:47 crc kubenswrapper[4766]: I1209 03:14:47.879735 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:47 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:47 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:47 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:47 crc kubenswrapper[4766]: I1209 03:14:47.879811 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:48 crc kubenswrapper[4766]: I1209 03:14:48.880349 4766 patch_prober.go:28] interesting pod/router-default-5444994796-8f855 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 03:14:48 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Dec 09 03:14:48 crc kubenswrapper[4766]: [+]process-running ok Dec 09 03:14:48 crc kubenswrapper[4766]: healthz check failed Dec 09 03:14:48 crc kubenswrapper[4766]: I1209 03:14:48.880430 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8f855" podUID="257bfa27-baf5-4470-84f1-71d06b37f763" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 03:14:49 crc kubenswrapper[4766]: I1209 03:14:49.882141 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:49 crc kubenswrapper[4766]: I1209 03:14:49.885993 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8f855" Dec 09 03:14:54 crc kubenswrapper[4766]: I1209 03:14:54.821397 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:14:55 crc kubenswrapper[4766]: I1209 03:14:55.121402 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 03:14:56 crc kubenswrapper[4766]: I1209 03:14:56.192700 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:56 crc kubenswrapper[4766]: I1209 03:14:56.199988 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:14:59 crc kubenswrapper[4766]: I1209 03:14:59.106415 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z6qth"] Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.142492 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm"] Dec 09 03:15:00 crc kubenswrapper[4766]: E1209 03:15:00.144862 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e320261d-efc3-454d-b4cc-103017b22f7b" containerName="pruner" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.144928 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e320261d-efc3-454d-b4cc-103017b22f7b" containerName="pruner" Dec 09 03:15:00 crc kubenswrapper[4766]: E1209 03:15:00.144944 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81005d27-0219-4f00-8191-624550153be5" containerName="pruner" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.144953 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="81005d27-0219-4f00-8191-624550153be5" containerName="pruner" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.145123 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="81005d27-0219-4f00-8191-624550153be5" containerName="pruner" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.145149 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e320261d-efc3-454d-b4cc-103017b22f7b" containerName="pruner" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.145824 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.150817 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.151311 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.166829 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm"] Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.239914 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c71c43-36b4-4147-b4c6-5e524c156343-config-volume\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.240048 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c71c43-36b4-4147-b4c6-5e524c156343-secret-volume\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.240273 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j2ck\" (UniqueName: \"kubernetes.io/projected/35c71c43-36b4-4147-b4c6-5e524c156343-kube-api-access-8j2ck\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.341871 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j2ck\" (UniqueName: \"kubernetes.io/projected/35c71c43-36b4-4147-b4c6-5e524c156343-kube-api-access-8j2ck\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.341959 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c71c43-36b4-4147-b4c6-5e524c156343-config-volume\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.341983 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c71c43-36b4-4147-b4c6-5e524c156343-secret-volume\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.343655 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c71c43-36b4-4147-b4c6-5e524c156343-config-volume\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.351123 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/35c71c43-36b4-4147-b4c6-5e524c156343-secret-volume\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.360324 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j2ck\" (UniqueName: \"kubernetes.io/projected/35c71c43-36b4-4147-b4c6-5e524c156343-kube-api-access-8j2ck\") pod \"collect-profiles-29420835-mn4qm\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:00 crc kubenswrapper[4766]: I1209 03:15:00.475363 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:03 crc kubenswrapper[4766]: E1209 03:15:03.732291 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 09 03:15:03 crc kubenswrapper[4766]: E1209 03:15:03.732963 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4smpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l2ll7_openshift-marketplace(b49b8e85-c4e7-4741-b61b-af26993717c2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 03:15:03 crc kubenswrapper[4766]: E1209 03:15:03.734474 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l2ll7" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" Dec 09 03:15:03 crc 
kubenswrapper[4766]: E1209 03:15:03.854414 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 09 03:15:03 crc kubenswrapper[4766]: E1209 03:15:03.854617 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2znl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-zzft2_openshift-marketplace(2b4dc351-fc6c-478e-8695-e6e778b7be24): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 03:15:03 crc kubenswrapper[4766]: E1209 03:15:03.857068 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zzft2" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" Dec 09 03:15:06 crc kubenswrapper[4766]: I1209 03:15:06.956184 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2djrp" Dec 09 03:15:07 crc kubenswrapper[4766]: I1209 03:15:07.316932 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:15:07 crc kubenswrapper[4766]: I1209 03:15:07.317338 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:15:08 crc kubenswrapper[4766]: W1209 03:15:08.589417 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c9eb693_99eb_4b02_b33a_26d506eeb3f1.slice/crio-b451c013301d926cd5c510b8b0b985a0137f73c8ceca8a1d56020a0d5ed7ca72 WatchSource:0}: Error finding container 
b451c013301d926cd5c510b8b0b985a0137f73c8ceca8a1d56020a0d5ed7ca72: Status 404 returned error can't find the container with id b451c013301d926cd5c510b8b0b985a0137f73c8ceca8a1d56020a0d5ed7ca72 Dec 09 03:15:08 crc kubenswrapper[4766]: E1209 03:15:08.589770 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zzft2" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" Dec 09 03:15:08 crc kubenswrapper[4766]: E1209 03:15:08.590017 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l2ll7" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" Dec 09 03:15:08 crc kubenswrapper[4766]: E1209 03:15:08.674863 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 09 03:15:08 crc kubenswrapper[4766]: E1209 03:15:08.675016 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s52x9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cltm8_openshift-marketplace(28803301-4e5b-4b88-b04a-1dfd4eb1d903): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 03:15:08 crc kubenswrapper[4766]: E1209 03:15:08.676545 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cltm8" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" Dec 09 03:15:08 crc 
kubenswrapper[4766]: E1209 03:15:08.697368 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 09 03:15:08 crc kubenswrapper[4766]: E1209 03:15:08.697646 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8tmvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-qnkzl_openshift-marketplace(9fee64f2-1434-4082-9674-088e4d93cb9a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 09 03:15:08 crc kubenswrapper[4766]: E1209 03:15:08.698755 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qnkzl" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 03:15:09.127420 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm"] Dec 09 03:15:09 crc kubenswrapper[4766]: W1209 03:15:09.160118 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c71c43_36b4_4147_b4c6_5e524c156343.slice/crio-5f1154cf8c8821d27006640775bf584e50c24865ac22a8d1eb3582f627fb4ea4 WatchSource:0}: Error finding container 5f1154cf8c8821d27006640775bf584e50c24865ac22a8d1eb3582f627fb4ea4: Status 404 returned error can't find the container with id 5f1154cf8c8821d27006640775bf584e50c24865ac22a8d1eb3582f627fb4ea4 Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 03:15:09.281971 4766 generic.go:334] "Generic (PLEG): container finished" podID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerID="300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c" exitCode=0 Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 03:15:09.282072 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxzck" event={"ID":"b2c59da0-4314-401f-9b47-cd3df45f4d26","Type":"ContainerDied","Data":"300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c"} Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 
03:15:09.283815 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" event={"ID":"35c71c43-36b4-4147-b4c6-5e524c156343","Type":"ContainerStarted","Data":"5f1154cf8c8821d27006640775bf584e50c24865ac22a8d1eb3582f627fb4ea4"} Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 03:15:09.285394 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z6qth" event={"ID":"5c9eb693-99eb-4b02-b33a-26d506eeb3f1","Type":"ContainerStarted","Data":"9ad424189849eda7d916b4a9c304b8f09e8ec7402b32309fb84daac8e2539ddc"} Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 03:15:09.285444 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z6qth" event={"ID":"5c9eb693-99eb-4b02-b33a-26d506eeb3f1","Type":"ContainerStarted","Data":"b451c013301d926cd5c510b8b0b985a0137f73c8ceca8a1d56020a0d5ed7ca72"} Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 03:15:09.287507 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vt2b" event={"ID":"dd777685-39e7-46bb-824f-f19ceaa179ca","Type":"ContainerStarted","Data":"e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6"} Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 03:15:09.293900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhkk" event={"ID":"2f459552-2bcd-4b18-98eb-7e523f6d69d8","Type":"ContainerStarted","Data":"5a4894beda7c306eaac579b1dfc0ebcb4741a95420863cd479edd89d8d6fd2a2"} Dec 09 03:15:09 crc kubenswrapper[4766]: I1209 03:15:09.299925 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6mbs" event={"ID":"8a05d77b-68a0-4e96-b71e-f5168ee4d38f","Type":"ContainerStarted","Data":"9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c"} Dec 09 03:15:09 crc kubenswrapper[4766]: E1209 03:15:09.305074 4766 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qnkzl" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" Dec 09 03:15:09 crc kubenswrapper[4766]: E1209 03:15:09.309075 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cltm8" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.310016 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z6qth" event={"ID":"5c9eb693-99eb-4b02-b33a-26d506eeb3f1","Type":"ContainerStarted","Data":"9625dacd8c7d6f72086401f838e9a215c020c6f149427d4547d0b441a6872615"} Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.313101 4766 generic.go:334] "Generic (PLEG): container finished" podID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerID="e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6" exitCode=0 Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.313198 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vt2b" event={"ID":"dd777685-39e7-46bb-824f-f19ceaa179ca","Type":"ContainerDied","Data":"e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6"} Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.315464 4766 generic.go:334] "Generic (PLEG): container finished" podID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerID="5a4894beda7c306eaac579b1dfc0ebcb4741a95420863cd479edd89d8d6fd2a2" exitCode=0 Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.315503 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-9bhkk" event={"ID":"2f459552-2bcd-4b18-98eb-7e523f6d69d8","Type":"ContainerDied","Data":"5a4894beda7c306eaac579b1dfc0ebcb4741a95420863cd479edd89d8d6fd2a2"} Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.319408 4766 generic.go:334] "Generic (PLEG): container finished" podID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerID="9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c" exitCode=0 Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.319479 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6mbs" event={"ID":"8a05d77b-68a0-4e96-b71e-f5168ee4d38f","Type":"ContainerDied","Data":"9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c"} Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.327069 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxzck" event={"ID":"b2c59da0-4314-401f-9b47-cd3df45f4d26","Type":"ContainerStarted","Data":"7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90"} Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.329437 4766 generic.go:334] "Generic (PLEG): container finished" podID="35c71c43-36b4-4147-b4c6-5e524c156343" containerID="c3498cd8dcee4146efe99589ef7392db4990be9443acc8daa3a5909e8d9de0d4" exitCode=0 Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.329499 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" event={"ID":"35c71c43-36b4-4147-b4c6-5e524c156343","Type":"ContainerDied","Data":"c3498cd8dcee4146efe99589ef7392db4990be9443acc8daa3a5909e8d9de0d4"} Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.334266 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z6qth" podStartSLOduration=167.334243342 podStartE2EDuration="2m47.334243342s" podCreationTimestamp="2025-12-09 
03:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:15:10.333628066 +0000 UTC m=+192.042933502" watchObservedRunningTime="2025-12-09 03:15:10.334243342 +0000 UTC m=+192.043548768" Dec 09 03:15:10 crc kubenswrapper[4766]: I1209 03:15:10.353966 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pxzck" podStartSLOduration=3.581088259 podStartE2EDuration="36.353941658s" podCreationTimestamp="2025-12-09 03:14:34 +0000 UTC" firstStartedPulling="2025-12-09 03:14:36.921416663 +0000 UTC m=+158.630722089" lastFinishedPulling="2025-12-09 03:15:09.694270052 +0000 UTC m=+191.403575488" observedRunningTime="2025-12-09 03:15:10.351137542 +0000 UTC m=+192.060442978" watchObservedRunningTime="2025-12-09 03:15:10.353941658 +0000 UTC m=+192.063247104" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.341363 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vt2b" event={"ID":"dd777685-39e7-46bb-824f-f19ceaa179ca","Type":"ContainerStarted","Data":"b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34"} Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.345859 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhkk" event={"ID":"2f459552-2bcd-4b18-98eb-7e523f6d69d8","Type":"ContainerStarted","Data":"ff4b3f1f1aa94f8b3509d2686b104a29ecf6de4ac713a3cac4ac1fe00d7bfc66"} Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.350170 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6mbs" event={"ID":"8a05d77b-68a0-4e96-b71e-f5168ee4d38f","Type":"ContainerStarted","Data":"c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6"} Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.391286 4766 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/certified-operators-6vt2b" podStartSLOduration=2.457889339 podStartE2EDuration="38.391204752s" podCreationTimestamp="2025-12-09 03:14:33 +0000 UTC" firstStartedPulling="2025-12-09 03:14:34.804465442 +0000 UTC m=+156.513770868" lastFinishedPulling="2025-12-09 03:15:10.737780855 +0000 UTC m=+192.447086281" observedRunningTime="2025-12-09 03:15:11.371524366 +0000 UTC m=+193.080829802" watchObservedRunningTime="2025-12-09 03:15:11.391204752 +0000 UTC m=+193.100510188" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.393571 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9bhkk" podStartSLOduration=2.43180874 podStartE2EDuration="36.393562315s" podCreationTimestamp="2025-12-09 03:14:35 +0000 UTC" firstStartedPulling="2025-12-09 03:14:36.883459681 +0000 UTC m=+158.592765107" lastFinishedPulling="2025-12-09 03:15:10.845213256 +0000 UTC m=+192.554518682" observedRunningTime="2025-12-09 03:15:11.390586225 +0000 UTC m=+193.099891651" watchObservedRunningTime="2025-12-09 03:15:11.393562315 +0000 UTC m=+193.102867741" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.746550 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.780631 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j6mbs" podStartSLOduration=3.843085314 podStartE2EDuration="39.78060305s" podCreationTimestamp="2025-12-09 03:14:32 +0000 UTC" firstStartedPulling="2025-12-09 03:14:34.791596653 +0000 UTC m=+156.500902079" lastFinishedPulling="2025-12-09 03:15:10.729114389 +0000 UTC m=+192.438419815" observedRunningTime="2025-12-09 03:15:11.424792785 +0000 UTC m=+193.134098231" watchObservedRunningTime="2025-12-09 03:15:11.78060305 +0000 UTC m=+193.489908476" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.815988 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c71c43-36b4-4147-b4c6-5e524c156343-secret-volume\") pod \"35c71c43-36b4-4147-b4c6-5e524c156343\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.816148 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c71c43-36b4-4147-b4c6-5e524c156343-config-volume\") pod \"35c71c43-36b4-4147-b4c6-5e524c156343\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.816178 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j2ck\" (UniqueName: \"kubernetes.io/projected/35c71c43-36b4-4147-b4c6-5e524c156343-kube-api-access-8j2ck\") pod \"35c71c43-36b4-4147-b4c6-5e524c156343\" (UID: \"35c71c43-36b4-4147-b4c6-5e524c156343\") " Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.817439 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/35c71c43-36b4-4147-b4c6-5e524c156343-config-volume" (OuterVolumeSpecName: "config-volume") pod "35c71c43-36b4-4147-b4c6-5e524c156343" (UID: "35c71c43-36b4-4147-b4c6-5e524c156343"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.825386 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c71c43-36b4-4147-b4c6-5e524c156343-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35c71c43-36b4-4147-b4c6-5e524c156343" (UID: "35c71c43-36b4-4147-b4c6-5e524c156343"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.825491 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c71c43-36b4-4147-b4c6-5e524c156343-kube-api-access-8j2ck" (OuterVolumeSpecName: "kube-api-access-8j2ck") pod "35c71c43-36b4-4147-b4c6-5e524c156343" (UID: "35c71c43-36b4-4147-b4c6-5e524c156343"). InnerVolumeSpecName "kube-api-access-8j2ck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.918000 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c71c43-36b4-4147-b4c6-5e524c156343-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.918061 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j2ck\" (UniqueName: \"kubernetes.io/projected/35c71c43-36b4-4147-b4c6-5e524c156343-kube-api-access-8j2ck\") on node \"crc\" DevicePath \"\"" Dec 09 03:15:11 crc kubenswrapper[4766]: I1209 03:15:11.918080 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c71c43-36b4-4147-b4c6-5e524c156343-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.357110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" event={"ID":"35c71c43-36b4-4147-b4c6-5e524c156343","Type":"ContainerDied","Data":"5f1154cf8c8821d27006640775bf584e50c24865ac22a8d1eb3582f627fb4ea4"} Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.357177 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f1154cf8c8821d27006640775bf584e50c24865ac22a8d1eb3582f627fb4ea4" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.357325 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.826621 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 03:15:12 crc kubenswrapper[4766]: E1209 03:15:12.826961 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c71c43-36b4-4147-b4c6-5e524c156343" containerName="collect-profiles" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.826987 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c71c43-36b4-4147-b4c6-5e524c156343" containerName="collect-profiles" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.827172 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c71c43-36b4-4147-b4c6-5e524c156343" containerName="collect-profiles" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.827813 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.830257 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.840299 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.847953 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.931697 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:12 crc kubenswrapper[4766]: I1209 03:15:12.931769 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.032840 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.032949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.033087 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.059959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 
03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.143969 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.317332 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.317405 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.424672 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.561292 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.561357 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.582477 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 03:15:13 crc kubenswrapper[4766]: W1209 03:15:13.592432 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda02588bb_5742_4b3b_a0fc_87e962a4f98d.slice/crio-5cc80fa30d6f4d89c1fdc7942b6384f007b899b517f442d72c1cfefbf84201a7 WatchSource:0}: Error finding container 5cc80fa30d6f4d89c1fdc7942b6384f007b899b517f442d72c1cfefbf84201a7: Status 404 returned error can't find the container with id 5cc80fa30d6f4d89c1fdc7942b6384f007b899b517f442d72c1cfefbf84201a7 Dec 09 03:15:13 crc kubenswrapper[4766]: I1209 03:15:13.603989 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:15:14 crc kubenswrapper[4766]: I1209 03:15:14.370568 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a02588bb-5742-4b3b-a0fc-87e962a4f98d","Type":"ContainerStarted","Data":"aca15c7f650875c38eb0b2f7611ff18854493b736d3ccefca79466d997dfc525"} Dec 09 03:15:14 crc kubenswrapper[4766]: I1209 03:15:14.370661 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a02588bb-5742-4b3b-a0fc-87e962a4f98d","Type":"ContainerStarted","Data":"5cc80fa30d6f4d89c1fdc7942b6384f007b899b517f442d72c1cfefbf84201a7"} Dec 09 03:15:14 crc kubenswrapper[4766]: I1209 03:15:14.394140 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.394113942 podStartE2EDuration="2.394113942s" podCreationTimestamp="2025-12-09 03:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:15:14.392961981 +0000 UTC m=+196.102267407" watchObservedRunningTime="2025-12-09 03:15:14.394113942 +0000 UTC m=+196.103419368" Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.286119 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.286716 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.341002 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.377547 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="a02588bb-5742-4b3b-a0fc-87e962a4f98d" containerID="aca15c7f650875c38eb0b2f7611ff18854493b736d3ccefca79466d997dfc525" exitCode=0 Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.377659 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a02588bb-5742-4b3b-a0fc-87e962a4f98d","Type":"ContainerDied","Data":"aca15c7f650875c38eb0b2f7611ff18854493b736d3ccefca79466d997dfc525"} Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.422509 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.659397 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.659450 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:15:15 crc kubenswrapper[4766]: I1209 03:15:15.707453 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:15:16 crc kubenswrapper[4766]: I1209 03:15:16.431047 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9bhkk" Dec 09 03:15:16 crc kubenswrapper[4766]: I1209 03:15:16.731850 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:16 crc kubenswrapper[4766]: I1209 03:15:16.787047 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kubelet-dir\") pod \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\" (UID: \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\") " Dec 09 03:15:16 crc kubenswrapper[4766]: I1209 03:15:16.787166 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a02588bb-5742-4b3b-a0fc-87e962a4f98d" (UID: "a02588bb-5742-4b3b-a0fc-87e962a4f98d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:15:16 crc kubenswrapper[4766]: I1209 03:15:16.787266 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kube-api-access\") pod \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\" (UID: \"a02588bb-5742-4b3b-a0fc-87e962a4f98d\") " Dec 09 03:15:16 crc kubenswrapper[4766]: I1209 03:15:16.787534 4766 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:15:16 crc kubenswrapper[4766]: I1209 03:15:16.802567 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a02588bb-5742-4b3b-a0fc-87e962a4f98d" (UID: "a02588bb-5742-4b3b-a0fc-87e962a4f98d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:15:16 crc kubenswrapper[4766]: I1209 03:15:16.889106 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a02588bb-5742-4b3b-a0fc-87e962a4f98d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 03:15:17 crc kubenswrapper[4766]: I1209 03:15:17.396562 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 03:15:17 crc kubenswrapper[4766]: I1209 03:15:17.396560 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a02588bb-5742-4b3b-a0fc-87e962a4f98d","Type":"ContainerDied","Data":"5cc80fa30d6f4d89c1fdc7942b6384f007b899b517f442d72c1cfefbf84201a7"} Dec 09 03:15:17 crc kubenswrapper[4766]: I1209 03:15:17.397305 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cc80fa30d6f4d89c1fdc7942b6384f007b899b517f442d72c1cfefbf84201a7" Dec 09 03:15:17 crc kubenswrapper[4766]: I1209 03:15:17.547581 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhkk"] Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.403560 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9bhkk" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerName="registry-server" containerID="cri-o://ff4b3f1f1aa94f8b3509d2686b104a29ecf6de4ac713a3cac4ac1fe00d7bfc66" gracePeriod=2 Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.623714 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 03:15:18 crc kubenswrapper[4766]: E1209 03:15:18.624305 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02588bb-5742-4b3b-a0fc-87e962a4f98d" containerName="pruner" Dec 09 03:15:18 crc 
kubenswrapper[4766]: I1209 03:15:18.624326 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02588bb-5742-4b3b-a0fc-87e962a4f98d" containerName="pruner" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.624654 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02588bb-5742-4b3b-a0fc-87e962a4f98d" containerName="pruner" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.625745 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.638856 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.639189 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.659559 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.722944 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.723057 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4256c428-3231-47d2-9a7d-a78c2214988b-kube-api-access\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.723094 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-var-lock\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.826363 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4256c428-3231-47d2-9a7d-a78c2214988b-kube-api-access\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.826499 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-var-lock\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.826540 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.826657 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-var-lock\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.827354 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.857609 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4256c428-3231-47d2-9a7d-a78c2214988b-kube-api-access\") pod \"installer-9-crc\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:18 crc kubenswrapper[4766]: I1209 03:15:18.963746 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.391889 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 03:15:19 crc kubenswrapper[4766]: W1209 03:15:19.404898 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4256c428_3231_47d2_9a7d_a78c2214988b.slice/crio-a71f821ca64246e7873c999092d4ada84d036a62c3959c4aa48b7f575806ecec WatchSource:0}: Error finding container a71f821ca64246e7873c999092d4ada84d036a62c3959c4aa48b7f575806ecec: Status 404 returned error can't find the container with id a71f821ca64246e7873c999092d4ada84d036a62c3959c4aa48b7f575806ecec Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.415949 4766 generic.go:334] "Generic (PLEG): container finished" podID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerID="ff4b3f1f1aa94f8b3509d2686b104a29ecf6de4ac713a3cac4ac1fe00d7bfc66" exitCode=0 Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.416011 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhkk" 
event={"ID":"2f459552-2bcd-4b18-98eb-7e523f6d69d8","Type":"ContainerDied","Data":"ff4b3f1f1aa94f8b3509d2686b104a29ecf6de4ac713a3cac4ac1fe00d7bfc66"}
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.416446 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bhkk" event={"ID":"2f459552-2bcd-4b18-98eb-7e523f6d69d8","Type":"ContainerDied","Data":"a7fbe9a4b82df2f0c408d5e5a483d705ef62fb3d9bc91bf33b37a275805d5ab9"}
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.416541 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7fbe9a4b82df2f0c408d5e5a483d705ef62fb3d9bc91bf33b37a275805d5ab9"
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.417588 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhkk"
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.547702 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-catalog-content\") pod \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") "
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.547802 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dlcp\" (UniqueName: \"kubernetes.io/projected/2f459552-2bcd-4b18-98eb-7e523f6d69d8-kube-api-access-6dlcp\") pod \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") "
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.547826 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-utilities\") pod \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\" (UID: \"2f459552-2bcd-4b18-98eb-7e523f6d69d8\") "
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.549471 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-utilities" (OuterVolumeSpecName: "utilities") pod "2f459552-2bcd-4b18-98eb-7e523f6d69d8" (UID: "2f459552-2bcd-4b18-98eb-7e523f6d69d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.563642 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f459552-2bcd-4b18-98eb-7e523f6d69d8-kube-api-access-6dlcp" (OuterVolumeSpecName: "kube-api-access-6dlcp") pod "2f459552-2bcd-4b18-98eb-7e523f6d69d8" (UID: "2f459552-2bcd-4b18-98eb-7e523f6d69d8"). InnerVolumeSpecName "kube-api-access-6dlcp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.584487 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f459552-2bcd-4b18-98eb-7e523f6d69d8" (UID: "2f459552-2bcd-4b18-98eb-7e523f6d69d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.649237 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.649285 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dlcp\" (UniqueName: \"kubernetes.io/projected/2f459552-2bcd-4b18-98eb-7e523f6d69d8-kube-api-access-6dlcp\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:19 crc kubenswrapper[4766]: I1209 03:15:19.649299 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f459552-2bcd-4b18-98eb-7e523f6d69d8-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:20 crc kubenswrapper[4766]: I1209 03:15:20.424194 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4256c428-3231-47d2-9a7d-a78c2214988b","Type":"ContainerStarted","Data":"a71f821ca64246e7873c999092d4ada84d036a62c3959c4aa48b7f575806ecec"}
Dec 09 03:15:20 crc kubenswrapper[4766]: I1209 03:15:20.424240 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bhkk"
Dec 09 03:15:20 crc kubenswrapper[4766]: I1209 03:15:20.462284 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhkk"]
Dec 09 03:15:20 crc kubenswrapper[4766]: I1209 03:15:20.465092 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bhkk"]
Dec 09 03:15:20 crc kubenswrapper[4766]: I1209 03:15:20.848015 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" path="/var/lib/kubelet/pods/2f459552-2bcd-4b18-98eb-7e523f6d69d8/volumes"
Dec 09 03:15:21 crc kubenswrapper[4766]: I1209 03:15:21.431188 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4256c428-3231-47d2-9a7d-a78c2214988b","Type":"ContainerStarted","Data":"11212f6bc8830825cffff75f6aeccfc29ea62635aebcd76d711bc698b7c5857a"}
Dec 09 03:15:22 crc kubenswrapper[4766]: I1209 03:15:22.438725 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzft2" event={"ID":"2b4dc351-fc6c-478e-8695-e6e778b7be24","Type":"ContainerStarted","Data":"482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622"}
Dec 09 03:15:22 crc kubenswrapper[4766]: I1209 03:15:22.460178 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.460159101 podStartE2EDuration="4.460159101s" podCreationTimestamp="2025-12-09 03:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:15:21.455565706 +0000 UTC m=+203.164871132" watchObservedRunningTime="2025-12-09 03:15:22.460159101 +0000 UTC m=+204.169464527"
Dec 09 03:15:23 crc kubenswrapper[4766]: I1209 03:15:23.365611 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6mbs"
Dec 09 03:15:23 crc kubenswrapper[4766]: I1209 03:15:23.445647 4766 generic.go:334] "Generic (PLEG): container finished" podID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerID="ce33ac8f05da1498f7d6b87974f98d12fe213ef140661819b0242fb0d3c80b3f" exitCode=0
Dec 09 03:15:23 crc kubenswrapper[4766]: I1209 03:15:23.445717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2ll7" event={"ID":"b49b8e85-c4e7-4741-b61b-af26993717c2","Type":"ContainerDied","Data":"ce33ac8f05da1498f7d6b87974f98d12fe213ef140661819b0242fb0d3c80b3f"}
Dec 09 03:15:23 crc kubenswrapper[4766]: I1209 03:15:23.450511 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerID="482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622" exitCode=0
Dec 09 03:15:23 crc kubenswrapper[4766]: I1209 03:15:23.450559 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzft2" event={"ID":"2b4dc351-fc6c-478e-8695-e6e778b7be24","Type":"ContainerDied","Data":"482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622"}
Dec 09 03:15:23 crc kubenswrapper[4766]: I1209 03:15:23.453417 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cltm8" event={"ID":"28803301-4e5b-4b88-b04a-1dfd4eb1d903","Type":"ContainerStarted","Data":"3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae"}
Dec 09 03:15:23 crc kubenswrapper[4766]: I1209 03:15:23.603816 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6vt2b"
Dec 09 03:15:24 crc kubenswrapper[4766]: I1209 03:15:24.464055 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2ll7" event={"ID":"b49b8e85-c4e7-4741-b61b-af26993717c2","Type":"ContainerStarted","Data":"97198fd4bdb699dbb440e0f98d152a828926af3b1d10161493c7eaf5d857e35c"}
Dec 09 03:15:24 crc kubenswrapper[4766]: I1209 03:15:24.467244 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzft2" event={"ID":"2b4dc351-fc6c-478e-8695-e6e778b7be24","Type":"ContainerStarted","Data":"566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357"}
Dec 09 03:15:24 crc kubenswrapper[4766]: I1209 03:15:24.470362 4766 generic.go:334] "Generic (PLEG): container finished" podID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerID="3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae" exitCode=0
Dec 09 03:15:24 crc kubenswrapper[4766]: I1209 03:15:24.470515 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cltm8" event={"ID":"28803301-4e5b-4b88-b04a-1dfd4eb1d903","Type":"ContainerDied","Data":"3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae"}
Dec 09 03:15:24 crc kubenswrapper[4766]: I1209 03:15:24.474050 4766 generic.go:334] "Generic (PLEG): container finished" podID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerID="c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13" exitCode=0
Dec 09 03:15:24 crc kubenswrapper[4766]: I1209 03:15:24.474094 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnkzl" event={"ID":"9fee64f2-1434-4082-9674-088e4d93cb9a","Type":"ContainerDied","Data":"c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13"}
Dec 09 03:15:24 crc kubenswrapper[4766]: I1209 03:15:24.483691 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2ll7" podStartSLOduration=2.481640625 podStartE2EDuration="51.483671931s" podCreationTimestamp="2025-12-09 03:14:33 +0000 UTC" firstStartedPulling="2025-12-09 03:14:34.808889473 +0000 UTC m=+156.518194899" lastFinishedPulling="2025-12-09 03:15:23.810920779 +0000 UTC m=+205.520226205" observedRunningTime="2025-12-09 03:15:24.483534418 +0000 UTC m=+206.192839884" watchObservedRunningTime="2025-12-09 03:15:24.483671931 +0000 UTC m=+206.192977357"
Dec 09 03:15:24 crc kubenswrapper[4766]: I1209 03:15:24.526189 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zzft2" podStartSLOduration=2.393098558 podStartE2EDuration="51.526169547s" podCreationTimestamp="2025-12-09 03:14:33 +0000 UTC" firstStartedPulling="2025-12-09 03:14:34.797788331 +0000 UTC m=+156.507093757" lastFinishedPulling="2025-12-09 03:15:23.93085932 +0000 UTC m=+205.640164746" observedRunningTime="2025-12-09 03:15:24.52080168 +0000 UTC m=+206.230107126" watchObservedRunningTime="2025-12-09 03:15:24.526169547 +0000 UTC m=+206.235474973"
Dec 09 03:15:33 crc kubenswrapper[4766]: I1209 03:15:33.711160 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2ll7"
Dec 09 03:15:33 crc kubenswrapper[4766]: I1209 03:15:33.711845 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2ll7"
Dec 09 03:15:33 crc kubenswrapper[4766]: I1209 03:15:33.784998 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2ll7"
Dec 09 03:15:33 crc kubenswrapper[4766]: I1209 03:15:33.920913 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zzft2"
Dec 09 03:15:33 crc kubenswrapper[4766]: I1209 03:15:33.920951 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zzft2"
Dec 09 03:15:34 crc kubenswrapper[4766]: I1209 03:15:34.195257 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zzft2"
Dec 09 03:15:34 crc kubenswrapper[4766]: I1209 03:15:34.598101 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zzft2"
Dec 09 03:15:34 crc kubenswrapper[4766]: I1209 03:15:34.598631 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2ll7"
Dec 09 03:15:36 crc kubenswrapper[4766]: I1209 03:15:36.392810 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzft2"]
Dec 09 03:15:36 crc kubenswrapper[4766]: I1209 03:15:36.556072 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zzft2" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerName="registry-server" containerID="cri-o://566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357" gracePeriod=2
Dec 09 03:15:36 crc kubenswrapper[4766]: I1209 03:15:36.965023 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzft2"
Dec 09 03:15:36 crc kubenswrapper[4766]: I1209 03:15:36.983153 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2ll7"]
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.136165 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-catalog-content\") pod \"2b4dc351-fc6c-478e-8695-e6e778b7be24\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") "
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.147436 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-utilities\") pod \"2b4dc351-fc6c-478e-8695-e6e778b7be24\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") "
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.147518 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2znl\" (UniqueName: \"kubernetes.io/projected/2b4dc351-fc6c-478e-8695-e6e778b7be24-kube-api-access-m2znl\") pod \"2b4dc351-fc6c-478e-8695-e6e778b7be24\" (UID: \"2b4dc351-fc6c-478e-8695-e6e778b7be24\") "
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.149292 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-utilities" (OuterVolumeSpecName: "utilities") pod "2b4dc351-fc6c-478e-8695-e6e778b7be24" (UID: "2b4dc351-fc6c-478e-8695-e6e778b7be24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.154227 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4dc351-fc6c-478e-8695-e6e778b7be24-kube-api-access-m2znl" (OuterVolumeSpecName: "kube-api-access-m2znl") pod "2b4dc351-fc6c-478e-8695-e6e778b7be24" (UID: "2b4dc351-fc6c-478e-8695-e6e778b7be24"). InnerVolumeSpecName "kube-api-access-m2znl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.204539 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b4dc351-fc6c-478e-8695-e6e778b7be24" (UID: "2b4dc351-fc6c-478e-8695-e6e778b7be24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.249880 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.249923 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4dc351-fc6c-478e-8695-e6e778b7be24-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.249956 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2znl\" (UniqueName: \"kubernetes.io/projected/2b4dc351-fc6c-478e-8695-e6e778b7be24-kube-api-access-m2znl\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.316625 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.317018 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.317068 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.317673 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.317788 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e" gracePeriod=600
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.564141 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerID="566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357" exitCode=0
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.564193 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzft2" event={"ID":"2b4dc351-fc6c-478e-8695-e6e778b7be24","Type":"ContainerDied","Data":"566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357"}
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.564252 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzft2" event={"ID":"2b4dc351-fc6c-478e-8695-e6e778b7be24","Type":"ContainerDied","Data":"c3a45893c01effe86ad73ab0ea23c2e5152a8b385cbadf773318da227a69208b"}
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.564272 4766 scope.go:117] "RemoveContainer" containerID="566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.564353 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzft2"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.566623 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cltm8" event={"ID":"28803301-4e5b-4b88-b04a-1dfd4eb1d903","Type":"ContainerStarted","Data":"30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43"}
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.570862 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnkzl" event={"ID":"9fee64f2-1434-4082-9674-088e4d93cb9a","Type":"ContainerStarted","Data":"3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e"}
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.573602 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e"}
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.573554 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e" exitCode=0
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.574004 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l2ll7" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerName="registry-server" containerID="cri-o://97198fd4bdb699dbb440e0f98d152a828926af3b1d10161493c7eaf5d857e35c" gracePeriod=2
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.597851 4766 scope.go:117] "RemoveContainer" containerID="482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.617105 4766 scope.go:117] "RemoveContainer" containerID="08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.625161 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cltm8" podStartSLOduration=3.189404672 podStartE2EDuration="1m1.625139184s" podCreationTimestamp="2025-12-09 03:14:36 +0000 UTC" firstStartedPulling="2025-12-09 03:14:37.943255918 +0000 UTC m=+159.652561344" lastFinishedPulling="2025-12-09 03:15:36.37899039 +0000 UTC m=+218.088295856" observedRunningTime="2025-12-09 03:15:37.593494134 +0000 UTC m=+219.302799560" watchObservedRunningTime="2025-12-09 03:15:37.625139184 +0000 UTC m=+219.334444600"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.625875 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qnkzl" podStartSLOduration=3.227545046 podStartE2EDuration="1m1.625865003s" podCreationTimestamp="2025-12-09 03:14:36 +0000 UTC" firstStartedPulling="2025-12-09 03:14:37.964951717 +0000 UTC m=+159.674257143" lastFinishedPulling="2025-12-09 03:15:36.363271674 +0000 UTC m=+218.072577100" observedRunningTime="2025-12-09 03:15:37.624787144 +0000 UTC m=+219.334092580" watchObservedRunningTime="2025-12-09 03:15:37.625865003 +0000 UTC m=+219.335170429"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.637916 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzft2"]
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.641637 4766 scope.go:117] "RemoveContainer" containerID="566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.645135 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zzft2"]
Dec 09 03:15:37 crc kubenswrapper[4766]: E1209 03:15:37.646127 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357\": container with ID starting with 566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357 not found: ID does not exist" containerID="566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.646303 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357"} err="failed to get container status \"566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357\": rpc error: code = NotFound desc = could not find container \"566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357\": container with ID starting with 566b6296a2ebbbb6cfe654aa0f7d76f36e672b63320ef7f8aa22a09ef3979357 not found: ID does not exist"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.646474 4766 scope.go:117] "RemoveContainer" containerID="482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622"
Dec 09 03:15:37 crc kubenswrapper[4766]: E1209 03:15:37.648417 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622\": container with ID starting with 482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622 not found: ID does not exist" containerID="482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.648562 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622"} err="failed to get container status \"482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622\": rpc error: code = NotFound desc = could not find container \"482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622\": container with ID starting with 482a92ea6d7d40a8555d2b05588ed1710618c46ebdde563b08c0b8e809aef622 not found: ID does not exist"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.648656 4766 scope.go:117] "RemoveContainer" containerID="08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb"
Dec 09 03:15:37 crc kubenswrapper[4766]: E1209 03:15:37.649061 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb\": container with ID starting with 08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb not found: ID does not exist" containerID="08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb"
Dec 09 03:15:37 crc kubenswrapper[4766]: I1209 03:15:37.649089 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb"} err="failed to get container status \"08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb\": rpc error: code = NotFound desc = could not find container \"08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb\": container with ID starting with 08a90221eabdf62f98f65caa44bd62733878231b8d43c9b3f4928013d9afdfdb not found: ID does not exist"
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.587023 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"529f1759cbae2ffe7af74a6b51baacdf0d487f623ff2e9c4f8dd43b5b64d87b4"}
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.601758 4766 generic.go:334] "Generic (PLEG): container finished" podID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerID="97198fd4bdb699dbb440e0f98d152a828926af3b1d10161493c7eaf5d857e35c" exitCode=0
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.601824 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2ll7" event={"ID":"b49b8e85-c4e7-4741-b61b-af26993717c2","Type":"ContainerDied","Data":"97198fd4bdb699dbb440e0f98d152a828926af3b1d10161493c7eaf5d857e35c"}
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.679597 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2ll7"
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.765183 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4smpj\" (UniqueName: \"kubernetes.io/projected/b49b8e85-c4e7-4741-b61b-af26993717c2-kube-api-access-4smpj\") pod \"b49b8e85-c4e7-4741-b61b-af26993717c2\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") "
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.765266 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-catalog-content\") pod \"b49b8e85-c4e7-4741-b61b-af26993717c2\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") "
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.765323 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-utilities\") pod \"b49b8e85-c4e7-4741-b61b-af26993717c2\" (UID: \"b49b8e85-c4e7-4741-b61b-af26993717c2\") "
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.766256 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-utilities" (OuterVolumeSpecName: "utilities") pod "b49b8e85-c4e7-4741-b61b-af26993717c2" (UID: "b49b8e85-c4e7-4741-b61b-af26993717c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.773387 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49b8e85-c4e7-4741-b61b-af26993717c2-kube-api-access-4smpj" (OuterVolumeSpecName: "kube-api-access-4smpj") pod "b49b8e85-c4e7-4741-b61b-af26993717c2" (UID: "b49b8e85-c4e7-4741-b61b-af26993717c2"). InnerVolumeSpecName "kube-api-access-4smpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.849064 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b49b8e85-c4e7-4741-b61b-af26993717c2" (UID: "b49b8e85-c4e7-4741-b61b-af26993717c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.849328 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" path="/var/lib/kubelet/pods/2b4dc351-fc6c-478e-8695-e6e778b7be24/volumes"
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.866938 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4smpj\" (UniqueName: \"kubernetes.io/projected/b49b8e85-c4e7-4741-b61b-af26993717c2-kube-api-access-4smpj\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.866979 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:38 crc kubenswrapper[4766]: I1209 03:15:38.866992 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b49b8e85-c4e7-4741-b61b-af26993717c2-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:39 crc kubenswrapper[4766]: I1209 03:15:39.614199 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2ll7" event={"ID":"b49b8e85-c4e7-4741-b61b-af26993717c2","Type":"ContainerDied","Data":"442ca3064c0376513aa35143db734c2e4abf3c41c2258d7058fa605c71ada9c2"}
Dec 09 03:15:39 crc kubenswrapper[4766]: I1209 03:15:39.614305 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2ll7"
Dec 09 03:15:39 crc kubenswrapper[4766]: I1209 03:15:39.615578 4766 scope.go:117] "RemoveContainer" containerID="97198fd4bdb699dbb440e0f98d152a828926af3b1d10161493c7eaf5d857e35c"
Dec 09 03:15:39 crc kubenswrapper[4766]: I1209 03:15:39.635806 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2ll7"]
Dec 09 03:15:39 crc kubenswrapper[4766]: I1209 03:15:39.637261 4766 scope.go:117] "RemoveContainer" containerID="ce33ac8f05da1498f7d6b87974f98d12fe213ef140661819b0242fb0d3c80b3f"
Dec 09 03:15:39 crc kubenswrapper[4766]: I1209 03:15:39.639409 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l2ll7"]
Dec 09 03:15:39 crc kubenswrapper[4766]: I1209 03:15:39.666493 4766 scope.go:117] "RemoveContainer" containerID="df807debd83b0d0876f0c7fe0fb95a759bfeb552bcee7a90e8d3a6b2e3b347d5"
Dec 09 03:15:40 crc kubenswrapper[4766]: I1209 03:15:40.847438 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" path="/var/lib/kubelet/pods/b49b8e85-c4e7-4741-b61b-af26993717c2/volumes"
Dec 09 03:15:45 crc kubenswrapper[4766]: I1209 03:15:45.570314 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxqs4"]
Dec 09 03:15:46 crc kubenswrapper[4766]: I1209 03:15:46.539512 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qnkzl"
Dec 09 03:15:46 crc kubenswrapper[4766]: I1209 03:15:46.539588 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qnkzl"
Dec 09 03:15:46 crc kubenswrapper[4766]: I1209 03:15:46.590874 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qnkzl"
Dec 09 03:15:46 crc kubenswrapper[4766]: I1209 03:15:46.723248 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qnkzl"
Dec 09 03:15:46 crc kubenswrapper[4766]: I1209 03:15:46.922955 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cltm8"
Dec 09 03:15:46 crc kubenswrapper[4766]: I1209 03:15:46.923094 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cltm8"
Dec 09 03:15:46 crc kubenswrapper[4766]: I1209 03:15:46.978079 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cltm8"
Dec 09 03:15:47 crc kubenswrapper[4766]: I1209 03:15:47.716410 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cltm8"
Dec 09 03:15:48 crc kubenswrapper[4766]: I1209 03:15:48.986433 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cltm8"]
Dec 09 03:15:49 crc kubenswrapper[4766]: I1209 03:15:49.685008 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cltm8" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerName="registry-server" containerID="cri-o://30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43" gracePeriod=2
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.099860 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cltm8"
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.153531 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-utilities\") pod \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") "
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.153637 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-catalog-content\") pod \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") "
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.153699 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/28803301-4e5b-4b88-b04a-1dfd4eb1d903-kube-api-access-s52x9\") pod \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\" (UID: \"28803301-4e5b-4b88-b04a-1dfd4eb1d903\") "
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.155140 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-utilities" (OuterVolumeSpecName: "utilities") pod "28803301-4e5b-4b88-b04a-1dfd4eb1d903" (UID: "28803301-4e5b-4b88-b04a-1dfd4eb1d903"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.161633 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28803301-4e5b-4b88-b04a-1dfd4eb1d903-kube-api-access-s52x9" (OuterVolumeSpecName: "kube-api-access-s52x9") pod "28803301-4e5b-4b88-b04a-1dfd4eb1d903" (UID: "28803301-4e5b-4b88-b04a-1dfd4eb1d903"). InnerVolumeSpecName "kube-api-access-s52x9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.255143 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28803301-4e5b-4b88-b04a-1dfd4eb1d903" (UID: "28803301-4e5b-4b88-b04a-1dfd4eb1d903"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.255750 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.255785 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28803301-4e5b-4b88-b04a-1dfd4eb1d903-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.255800 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s52x9\" (UniqueName: \"kubernetes.io/projected/28803301-4e5b-4b88-b04a-1dfd4eb1d903-kube-api-access-s52x9\") on node \"crc\" DevicePath \"\""
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.693444 4766 generic.go:334] "Generic (PLEG): container finished" podID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerID="30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43" exitCode=0
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.693509 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cltm8" event={"ID":"28803301-4e5b-4b88-b04a-1dfd4eb1d903","Type":"ContainerDied","Data":"30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43"}
Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.693579 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-cltm8" event={"ID":"28803301-4e5b-4b88-b04a-1dfd4eb1d903","Type":"ContainerDied","Data":"473f6fd500ff828a557cc45df7de003f33d82755e09c6fd4ecbccb32207d0e5e"} Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.693611 4766 scope.go:117] "RemoveContainer" containerID="30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.693655 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cltm8" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.723201 4766 scope.go:117] "RemoveContainer" containerID="3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.742533 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cltm8"] Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.750457 4766 scope.go:117] "RemoveContainer" containerID="c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.757846 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cltm8"] Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.781989 4766 scope.go:117] "RemoveContainer" containerID="30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43" Dec 09 03:15:50 crc kubenswrapper[4766]: E1209 03:15:50.782723 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43\": container with ID starting with 30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43 not found: ID does not exist" containerID="30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.782809 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43"} err="failed to get container status \"30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43\": rpc error: code = NotFound desc = could not find container \"30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43\": container with ID starting with 30eebfc624a22bee26c16a040bf498be6d35262da9e2b2bcb9571c4c1bbf7b43 not found: ID does not exist" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.782856 4766 scope.go:117] "RemoveContainer" containerID="3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae" Dec 09 03:15:50 crc kubenswrapper[4766]: E1209 03:15:50.783485 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae\": container with ID starting with 3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae not found: ID does not exist" containerID="3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.783565 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae"} err="failed to get container status \"3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae\": rpc error: code = NotFound desc = could not find container \"3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae\": container with ID starting with 3fcd16d9cc711d26c546297d58efd02f0b52bde81d818456b58e15e1c4c05bae not found: ID does not exist" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.783597 4766 scope.go:117] "RemoveContainer" containerID="c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d" Dec 09 03:15:50 crc kubenswrapper[4766]: E1209 
03:15:50.783964 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d\": container with ID starting with c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d not found: ID does not exist" containerID="c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.784004 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d"} err="failed to get container status \"c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d\": rpc error: code = NotFound desc = could not find container \"c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d\": container with ID starting with c678f2343f756ed20cad3496d5c93c5df5f659d43d4a049f7fc209904da1ae1d not found: ID does not exist" Dec 09 03:15:50 crc kubenswrapper[4766]: I1209 03:15:50.846979 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" path="/var/lib/kubelet/pods/28803301-4e5b-4b88-b04a-1dfd4eb1d903/volumes" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.477076 4766 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478140 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376" gracePeriod=15 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478154 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e" gracePeriod=15 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478326 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792" gracePeriod=15 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478361 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca" gracePeriod=15 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478470 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976" gracePeriod=15 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478600 4766 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.478893 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478911 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.478927 4766 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478937 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.478952 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478962 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.478972 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.478982 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.478997 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerName="extract-utilities" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479006 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerName="extract-utilities" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479018 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479026 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 03:15:58 crc 
kubenswrapper[4766]: E1209 03:15:58.479038 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerName="extract-utilities" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479047 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerName="extract-utilities" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479056 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479064 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479077 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerName="extract-content" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479085 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerName="extract-content" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479100 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerName="extract-content" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479112 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerName="extract-content" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479123 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerName="extract-utilities" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479132 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerName="extract-utilities" Dec 
09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479143 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerName="extract-content" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479151 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerName="extract-content" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479165 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479174 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479187 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479198 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479235 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479247 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479261 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerName="extract-content" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479270 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerName="extract-content" Dec 09 
03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479286 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479295 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479308 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479317 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: E1209 03:15:58.479331 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerName="extract-utilities" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479340 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerName="extract-utilities" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479477 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="28803301-4e5b-4b88-b04a-1dfd4eb1d903" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479527 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479539 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f459552-2bcd-4b18-98eb-7e523f6d69d8" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479601 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479613 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4dc351-fc6c-478e-8695-e6e778b7be24" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479631 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479642 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479667 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49b8e85-c4e7-4741-b61b-af26993717c2" containerName="registry-server" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479678 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.479957 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.481285 4766 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.481963 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.487803 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.614251 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.614316 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.614338 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.614400 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.614418 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.614442 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.614460 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.614479 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715531 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715579 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715614 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715633 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715658 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715677 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc 
kubenswrapper[4766]: I1209 03:15:58.715698 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715701 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715729 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715728 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715766 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715773 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715772 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715823 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715850 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.715899 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.746148 4766 generic.go:334] "Generic (PLEG): container finished" podID="4256c428-3231-47d2-9a7d-a78c2214988b" 
containerID="11212f6bc8830825cffff75f6aeccfc29ea62635aebcd76d711bc698b7c5857a" exitCode=0 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.746245 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4256c428-3231-47d2-9a7d-a78c2214988b","Type":"ContainerDied","Data":"11212f6bc8830825cffff75f6aeccfc29ea62635aebcd76d711bc698b7c5857a"} Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.747447 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.749898 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.751769 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.752882 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e" exitCode=0 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.752910 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792" exitCode=0 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.752917 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca" exitCode=0 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.752926 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976" exitCode=2 Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.752964 4766 scope.go:117] "RemoveContainer" containerID="36043624b462a6aa5ef71d7f2fdeb26894699959b222dd39ca22a0162e11de61" Dec 09 03:15:58 crc kubenswrapper[4766]: I1209 03:15:58.851723 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:15:59 crc kubenswrapper[4766]: I1209 03:15:59.766409 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.051929 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.052945 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.136308 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-kubelet-dir\") pod \"4256c428-3231-47d2-9a7d-a78c2214988b\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.136414 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4256c428-3231-47d2-9a7d-a78c2214988b" (UID: "4256c428-3231-47d2-9a7d-a78c2214988b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.136439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-var-lock\") pod \"4256c428-3231-47d2-9a7d-a78c2214988b\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.136481 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-var-lock" (OuterVolumeSpecName: "var-lock") pod "4256c428-3231-47d2-9a7d-a78c2214988b" (UID: "4256c428-3231-47d2-9a7d-a78c2214988b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.136543 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4256c428-3231-47d2-9a7d-a78c2214988b-kube-api-access\") pod \"4256c428-3231-47d2-9a7d-a78c2214988b\" (UID: \"4256c428-3231-47d2-9a7d-a78c2214988b\") " Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.137103 4766 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.137148 4766 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4256c428-3231-47d2-9a7d-a78c2214988b-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.146208 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4256c428-3231-47d2-9a7d-a78c2214988b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4256c428-3231-47d2-9a7d-a78c2214988b" (UID: "4256c428-3231-47d2-9a7d-a78c2214988b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.238820 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4256c428-3231-47d2-9a7d-a78c2214988b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.774641 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4256c428-3231-47d2-9a7d-a78c2214988b","Type":"ContainerDied","Data":"a71f821ca64246e7873c999092d4ada84d036a62c3959c4aa48b7f575806ecec"} Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.774923 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a71f821ca64246e7873c999092d4ada84d036a62c3959c4aa48b7f575806ecec" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.774684 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.878428 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.882148 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.882828 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.883453 4766 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:00 crc kubenswrapper[4766]: I1209 03:16:00.883837 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.047353 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.047400 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.047471 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.047619 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.047690 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.047745 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.048150 4766 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.048243 4766 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.048289 4766 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.785072 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.786921 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376" exitCode=0 Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.787013 4766 scope.go:117] "RemoveContainer" containerID="d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.787111 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.808947 4766 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.809627 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.815883 4766 scope.go:117] "RemoveContainer" containerID="dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.836136 4766 scope.go:117] "RemoveContainer" containerID="8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.856282 4766 scope.go:117] "RemoveContainer" containerID="3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.872968 4766 scope.go:117] "RemoveContainer" containerID="10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.888499 4766 scope.go:117] "RemoveContainer" containerID="f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.915308 4766 scope.go:117] "RemoveContainer" containerID="d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e" Dec 09 03:16:01 crc kubenswrapper[4766]: E1209 03:16:01.916355 4766 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\": container with ID starting with d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e not found: ID does not exist" containerID="d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.916398 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e"} err="failed to get container status \"d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\": rpc error: code = NotFound desc = could not find container \"d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e\": container with ID starting with d53321e5338f7918529deab4c70a3c34f035d024d389d8b21bec67c7ab1eb21e not found: ID does not exist" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.916420 4766 scope.go:117] "RemoveContainer" containerID="dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792" Dec 09 03:16:01 crc kubenswrapper[4766]: E1209 03:16:01.916664 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\": container with ID starting with dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792 not found: ID does not exist" containerID="dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.916698 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792"} err="failed to get container status \"dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\": rpc error: code = NotFound desc = could 
not find container \"dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792\": container with ID starting with dd0bfeaf799605a296eb92b6e00354d10bba19dcda1ac944b428838c3121b792 not found: ID does not exist" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.916714 4766 scope.go:117] "RemoveContainer" containerID="8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca" Dec 09 03:16:01 crc kubenswrapper[4766]: E1209 03:16:01.916909 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\": container with ID starting with 8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca not found: ID does not exist" containerID="8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.916929 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca"} err="failed to get container status \"8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\": rpc error: code = NotFound desc = could not find container \"8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca\": container with ID starting with 8a99638b9e3c052c06615d548ab863dc19dfda9c8cc109bb5494331bf967f8ca not found: ID does not exist" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.916943 4766 scope.go:117] "RemoveContainer" containerID="3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976" Dec 09 03:16:01 crc kubenswrapper[4766]: E1209 03:16:01.917261 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\": container with ID starting with 3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976 not found: 
ID does not exist" containerID="3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.917302 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976"} err="failed to get container status \"3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\": rpc error: code = NotFound desc = could not find container \"3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976\": container with ID starting with 3f7352ef6579309247f7c67f99085e55751084141723eff6cb5a98584107c976 not found: ID does not exist" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.917314 4766 scope.go:117] "RemoveContainer" containerID="10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376" Dec 09 03:16:01 crc kubenswrapper[4766]: E1209 03:16:01.918703 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\": container with ID starting with 10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376 not found: ID does not exist" containerID="10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.918731 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376"} err="failed to get container status \"10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\": rpc error: code = NotFound desc = could not find container \"10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376\": container with ID starting with 10b4029399669f0ae5e3cdd5e06be3b6090e0d99b21ce36950c22bc6e71d6376 not found: ID does not exist" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.918746 4766 
scope.go:117] "RemoveContainer" containerID="f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350" Dec 09 03:16:01 crc kubenswrapper[4766]: E1209 03:16:01.919478 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\": container with ID starting with f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350 not found: ID does not exist" containerID="f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350" Dec 09 03:16:01 crc kubenswrapper[4766]: I1209 03:16:01.919503 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350"} err="failed to get container status \"f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\": rpc error: code = NotFound desc = could not find container \"f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350\": container with ID starting with f4dd968021a8eac5663d08cd338ad4038eceda66a129a4e3998c30c8ee25a350 not found: ID does not exist" Dec 09 03:16:02 crc kubenswrapper[4766]: I1209 03:16:02.849841 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 09 03:16:03 crc kubenswrapper[4766]: E1209 03:16:03.525907 4766 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:16:03 crc kubenswrapper[4766]: I1209 03:16:03.526673 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:16:03 crc kubenswrapper[4766]: E1209 03:16:03.548191 4766 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f6dabc9f4073a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 03:16:03.547572026 +0000 UTC m=+245.256877452,LastTimestamp:2025-12-09 03:16:03.547572026 +0000 UTC m=+245.256877452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 03:16:03 crc kubenswrapper[4766]: I1209 03:16:03.803719 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f"} Dec 09 03:16:03 crc kubenswrapper[4766]: I1209 03:16:03.804293 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6a02c69f07a530923172285325ed3f4a024d3a26a99d85725fa7d05a3573c695"} Dec 09 03:16:03 crc 
kubenswrapper[4766]: I1209 03:16:03.804994 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:03 crc kubenswrapper[4766]: E1209 03:16:03.805039 4766 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:16:05 crc kubenswrapper[4766]: E1209 03:16:05.168398 4766 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:05 crc kubenswrapper[4766]: E1209 03:16:05.170415 4766 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:05 crc kubenswrapper[4766]: E1209 03:16:05.170712 4766 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:05 crc kubenswrapper[4766]: E1209 03:16:05.170978 4766 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:05 crc kubenswrapper[4766]: E1209 03:16:05.171260 4766 controller.go:195] "Failed to update lease" 
err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:05 crc kubenswrapper[4766]: I1209 03:16:05.171293 4766 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 09 03:16:05 crc kubenswrapper[4766]: E1209 03:16:05.171502 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="200ms" Dec 09 03:16:05 crc kubenswrapper[4766]: E1209 03:16:05.372827 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="400ms" Dec 09 03:16:05 crc kubenswrapper[4766]: E1209 03:16:05.774429 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="800ms" Dec 09 03:16:06 crc kubenswrapper[4766]: E1209 03:16:06.578325 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="1.6s" Dec 09 03:16:08 crc kubenswrapper[4766]: E1209 03:16:08.179583 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: 
connection refused" interval="3.2s" Dec 09 03:16:08 crc kubenswrapper[4766]: I1209 03:16:08.842176 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:09 crc kubenswrapper[4766]: I1209 03:16:09.838949 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:09 crc kubenswrapper[4766]: I1209 03:16:09.840326 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:09 crc kubenswrapper[4766]: E1209 03:16:09.840345 4766 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f6dabc9f4073a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 03:16:03.547572026 +0000 UTC 
m=+245.256877452,LastTimestamp:2025-12-09 03:16:03.547572026 +0000 UTC m=+245.256877452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 03:16:09 crc kubenswrapper[4766]: I1209 03:16:09.858519 4766 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:09 crc kubenswrapper[4766]: I1209 03:16:09.858571 4766 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:09 crc kubenswrapper[4766]: E1209 03:16:09.859290 4766 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:09 crc kubenswrapper[4766]: I1209 03:16:09.860435 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.603503 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" podUID="2f2facf5-7977-44e9-beea-141276d212a5" containerName="oauth-openshift" containerID="cri-o://464d3f552bf4e3b2c2b7119d46c4b66a54935d662874ad750e3ca9fe8956410a" gracePeriod=15 Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.853567 4766 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9f117a0dbfd573277fa691448aa22669c6cf6847d68d33f0594eb4a54b96d32d" exitCode=0 Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.853649 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9f117a0dbfd573277fa691448aa22669c6cf6847d68d33f0594eb4a54b96d32d"} Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.854072 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7dd662eb6c063c70bfd6872f4312b11dbf557c4b8a3012d07f23346692ccb248"} Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.854521 4766 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.854540 4766 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.855524 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:10 crc kubenswrapper[4766]: E1209 03:16:10.855898 4766 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.856171 4766 generic.go:334] "Generic (PLEG): container finished" podID="2f2facf5-7977-44e9-beea-141276d212a5" containerID="464d3f552bf4e3b2c2b7119d46c4b66a54935d662874ad750e3ca9fe8956410a" exitCode=0 Dec 09 03:16:10 crc kubenswrapper[4766]: I1209 03:16:10.856310 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" event={"ID":"2f2facf5-7977-44e9-beea-141276d212a5","Type":"ContainerDied","Data":"464d3f552bf4e3b2c2b7119d46c4b66a54935d662874ad750e3ca9fe8956410a"} Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.032857 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.033770 4766 status_manager.go:851] "Failed to get status for pod" podUID="2f2facf5-7977-44e9-beea-141276d212a5" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fxqs4\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.034190 4766 status_manager.go:851] "Failed to get status for pod" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.201723 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f2facf5-7977-44e9-beea-141276d212a5-audit-dir\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.201821 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-session\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.201854 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-router-certs\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 
03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.201854 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f2facf5-7977-44e9-beea-141276d212a5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.201916 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l7zt\" (UniqueName: \"kubernetes.io/projected/2f2facf5-7977-44e9-beea-141276d212a5-kube-api-access-2l7zt\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.201956 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-serving-cert\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.201984 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-ocp-branding-template\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202007 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-provider-selection\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 
03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202028 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-error\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202048 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-service-ca\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202066 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-trusted-ca-bundle\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202096 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-idp-0-file-data\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202118 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-cliconfig\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202151 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-login\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202168 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-audit-policies\") pod \"2f2facf5-7977-44e9-beea-141276d212a5\" (UID: \"2f2facf5-7977-44e9-beea-141276d212a5\") " Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.202388 4766 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2f2facf5-7977-44e9-beea-141276d212a5-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.203800 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.203814 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.204344 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.205057 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.210330 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.210431 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2facf5-7977-44e9-beea-141276d212a5-kube-api-access-2l7zt" (OuterVolumeSpecName: "kube-api-access-2l7zt") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "kube-api-access-2l7zt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.210594 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.211694 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.212048 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.212378 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.212612 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.213477 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.213767 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2f2facf5-7977-44e9-beea-141276d212a5" (UID: "2f2facf5-7977-44e9-beea-141276d212a5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.303790 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l7zt\" (UniqueName: \"kubernetes.io/projected/2f2facf5-7977-44e9-beea-141276d212a5-kube-api-access-2l7zt\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.304347 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.304366 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.304381 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.304398 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.304411 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.304424 4766 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.305714 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.305847 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.305862 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.305879 4766 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2f2facf5-7977-44e9-beea-141276d212a5-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.305893 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.305906 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2f2facf5-7977-44e9-beea-141276d212a5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 
03:16:11.864958 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.865002 4766 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df" exitCode=1 Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.865050 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df"} Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.865507 4766 scope.go:117] "RemoveContainer" containerID="d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.869415 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" event={"ID":"2f2facf5-7977-44e9-beea-141276d212a5","Type":"ContainerDied","Data":"60199c4b1dd0ad177ebadd1f95967d79953ec307758b6bce22b989f98b55889d"} Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.869476 4766 scope.go:117] "RemoveContainer" containerID="464d3f552bf4e3b2c2b7119d46c4b66a54935d662874ad750e3ca9fe8956410a" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.869626 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxqs4" Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.886042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e11c16c3e422cc91d742f32ecc965db7c67c9f54f64abd141dc75c1b460147eb"} Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.886096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3687e0b2f15749734655482105936b3bcb688e43281cf06c1969c2f5df31ff57"} Dec 09 03:16:11 crc kubenswrapper[4766]: I1209 03:16:11.886107 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a02d78849dc96ecef5314fe46f3d29baa3ea291ecdf21228076b999a6bd0fccc"} Dec 09 03:16:12 crc kubenswrapper[4766]: I1209 03:16:12.896864 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 03:16:12 crc kubenswrapper[4766]: I1209 03:16:12.897369 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49ff95970c00b7e1bfc02dbcc9d587d91db93912cbe67559418d50257b0e8db5"} Dec 09 03:16:12 crc kubenswrapper[4766]: I1209 03:16:12.902251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c4d4d025dc4a0feaf12757863022e2e73935f7a79c045f4afaaa76f701e89b96"} Dec 09 03:16:12 crc kubenswrapper[4766]: I1209 
03:16:12.902306 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d4e390e60573eeaa96bd3cd6a8990a9dbd10d38f194a6b0415bc9970617dce56"} Dec 09 03:16:12 crc kubenswrapper[4766]: I1209 03:16:12.902475 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:12 crc kubenswrapper[4766]: I1209 03:16:12.902722 4766 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:12 crc kubenswrapper[4766]: I1209 03:16:12.902757 4766 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:13 crc kubenswrapper[4766]: I1209 03:16:13.392634 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:16:13 crc kubenswrapper[4766]: I1209 03:16:13.392990 4766 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 09 03:16:13 crc kubenswrapper[4766]: I1209 03:16:13.393055 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 09 03:16:14 crc kubenswrapper[4766]: I1209 03:16:14.039181 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:16:14 crc kubenswrapper[4766]: I1209 03:16:14.861249 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:14 crc kubenswrapper[4766]: I1209 03:16:14.861348 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:14 crc kubenswrapper[4766]: I1209 03:16:14.870903 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:17 crc kubenswrapper[4766]: I1209 03:16:17.910434 4766 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:17 crc kubenswrapper[4766]: I1209 03:16:17.934276 4766 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:17 crc kubenswrapper[4766]: I1209 03:16:17.934630 4766 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:17 crc kubenswrapper[4766]: I1209 03:16:17.939670 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:18 crc kubenswrapper[4766]: I1209 03:16:18.858457 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d33625fd-f8b3-40e8-a933-f261ef18ce75" Dec 09 03:16:18 crc kubenswrapper[4766]: I1209 03:16:18.940363 4766 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:18 crc kubenswrapper[4766]: I1209 03:16:18.940412 4766 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d9df7434-1d9b-4c4f-96b5-4ab3e4be8b17" Dec 09 03:16:18 crc kubenswrapper[4766]: I1209 03:16:18.944461 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d33625fd-f8b3-40e8-a933-f261ef18ce75" Dec 09 03:16:23 crc kubenswrapper[4766]: I1209 03:16:23.392494 4766 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 09 03:16:23 crc kubenswrapper[4766]: I1209 03:16:23.393032 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 09 03:16:24 crc kubenswrapper[4766]: I1209 03:16:24.758821 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 03:16:26 crc kubenswrapper[4766]: I1209 03:16:26.124323 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 03:16:27 crc kubenswrapper[4766]: I1209 03:16:27.038255 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 03:16:27 crc kubenswrapper[4766]: I1209 03:16:27.518409 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 03:16:28 crc kubenswrapper[4766]: I1209 03:16:28.128897 4766 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 03:16:28 crc kubenswrapper[4766]: I1209 03:16:28.369470 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 03:16:28 crc kubenswrapper[4766]: I1209 03:16:28.875676 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 03:16:29 crc kubenswrapper[4766]: I1209 03:16:29.310977 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 03:16:29 crc kubenswrapper[4766]: I1209 03:16:29.640580 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.043994 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.275962 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.287194 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.463187 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.495199 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.623707 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 
03:16:30.675261 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.754149 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.816807 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 03:16:30 crc kubenswrapper[4766]: I1209 03:16:30.964954 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.164584 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.186662 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.258617 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.269767 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.560195 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.717150 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.728359 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 03:16:31 
crc kubenswrapper[4766]: I1209 03:16:31.752635 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.873976 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 03:16:31 crc kubenswrapper[4766]: I1209 03:16:31.949292 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.009271 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.119656 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.187924 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.224055 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.261603 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.354942 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.387129 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.401526 4766 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.438819 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.514593 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.517586 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.524644 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.529983 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.610395 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.639810 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.650378 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.769176 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.795010 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.830148 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 03:16:32 crc kubenswrapper[4766]: I1209 03:16:32.866356 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.151894 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.200575 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.330231 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.352167 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.362338 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.369138 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.392617 4766 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.392713 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.392794 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.393687 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"49ff95970c00b7e1bfc02dbcc9d587d91db93912cbe67559418d50257b0e8db5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.393919 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://49ff95970c00b7e1bfc02dbcc9d587d91db93912cbe67559418d50257b0e8db5" gracePeriod=30 Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.589845 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.691406 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.710916 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.968627 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 03:16:33 crc kubenswrapper[4766]: I1209 03:16:33.980611 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.012235 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.036845 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.164419 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.168695 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.203690 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.220581 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.229646 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.246279 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.249755 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.382984 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.471744 4766 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.693982 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.735381 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.756639 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.862112 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.874377 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.894321 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.972004 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 03:16:34 crc kubenswrapper[4766]: I1209 03:16:34.989006 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.056881 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.151525 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.153072 4766 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.166069 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.274321 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.279455 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.684785 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.702918 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.856600 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.865198 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.891458 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 03:16:35 crc kubenswrapper[4766]: I1209 03:16:35.928995 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.000632 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.017973 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.089960 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.100861 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.113816 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.137591 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.324313 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.410870 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.486452 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.654128 4766 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.664775 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 03:16:36 crc 
kubenswrapper[4766]: I1209 03:16:36.727141 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.972389 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 03:16:36 crc kubenswrapper[4766]: I1209 03:16:36.994509 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.034888 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.072814 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.292642 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.321694 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.362916 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.392733 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.420112 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.439474 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.508095 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.594625 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.614481 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.667747 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.699788 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.790984 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.810455 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.819841 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.939893 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.942095 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 03:16:37 crc kubenswrapper[4766]: I1209 03:16:37.984022 4766 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.008525 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.013828 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.104567 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.107805 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.141721 4766 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.150719 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-fxqs4"] Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.150811 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.158623 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.182773 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.182753252 podStartE2EDuration="21.182753252s" podCreationTimestamp="2025-12-09 03:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:16:38.176861066 +0000 UTC m=+279.886166532" watchObservedRunningTime="2025-12-09 03:16:38.182753252 +0000 UTC m=+279.892058708" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.240829 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.243974 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86449fd7db-ttjdz"] Dec 09 03:16:38 crc kubenswrapper[4766]: E1209 03:16:38.244191 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2facf5-7977-44e9-beea-141276d212a5" containerName="oauth-openshift" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.244233 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2facf5-7977-44e9-beea-141276d212a5" containerName="oauth-openshift" Dec 09 03:16:38 crc kubenswrapper[4766]: E1209 03:16:38.244264 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" containerName="installer" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.244272 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" containerName="installer" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.244396 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2facf5-7977-44e9-beea-141276d212a5" containerName="oauth-openshift" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.244426 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4256c428-3231-47d2-9a7d-a78c2214988b" containerName="installer" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.244977 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.249271 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.249609 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.249570 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.250294 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.250578 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.250665 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.250859 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.250860 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.251941 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.251950 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 03:16:38 crc 
kubenswrapper[4766]: I1209 03:16:38.253002 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.253403 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.261653 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.264658 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.271288 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.319659 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.341949 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.362768 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.362831 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.362863 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363142 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-router-certs\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363300 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-error\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363391 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363471 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363528 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-service-ca\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363633 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-session\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363707 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-audit-policies\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc 
kubenswrapper[4766]: I1209 03:16:38.363758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bdeacd90-386b-4953-ba45-411e1cde2ed9-audit-dir\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363820 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-login\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363856 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hllgx\" (UniqueName: \"kubernetes.io/projected/bdeacd90-386b-4953-ba45-411e1cde2ed9-kube-api-access-hllgx\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.363889 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.376412 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 03:16:38 crc 
kubenswrapper[4766]: I1209 03:16:38.397283 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.461458 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-session\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465399 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-audit-policies\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465429 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bdeacd90-386b-4953-ba45-411e1cde2ed9-audit-dir\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465461 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-login\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " 
pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465487 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hllgx\" (UniqueName: \"kubernetes.io/projected/bdeacd90-386b-4953-ba45-411e1cde2ed9-kube-api-access-hllgx\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465514 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465533 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465556 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/bdeacd90-386b-4953-ba45-411e1cde2ed9-audit-dir\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465586 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465653 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-router-certs\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465707 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-error\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.466811 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 
09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.465736 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.466923 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.466973 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-service-ca\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.467049 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-audit-policies\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.467688 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.467724 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-service-ca\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.472990 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-router-certs\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.473000 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.473455 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " 
pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.473520 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-system-session\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.473529 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-error\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.473883 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.475868 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.477788 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.490816 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bdeacd90-386b-4953-ba45-411e1cde2ed9-v4-0-config-user-template-login\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.499958 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.506946 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hllgx\" (UniqueName: \"kubernetes.io/projected/bdeacd90-386b-4953-ba45-411e1cde2ed9-kube-api-access-hllgx\") pod \"oauth-openshift-86449fd7db-ttjdz\" (UID: \"bdeacd90-386b-4953-ba45-411e1cde2ed9\") " pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.540162 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.576710 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.622666 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.718437 4766 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.794816 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.827895 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.847128 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2facf5-7977-44e9-beea-141276d212a5" path="/var/lib/kubelet/pods/2f2facf5-7977-44e9-beea-141276d212a5/volumes" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.859961 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 03:16:38 crc kubenswrapper[4766]: I1209 03:16:38.969770 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.075291 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.124365 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.147327 4766 reflector.go:368] Caches populated for 
*v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.236808 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.282013 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.325848 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.338317 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.387565 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.567061 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.744280 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.788434 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.870318 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.875092 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.954046 4766 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 03:16:39 crc kubenswrapper[4766]: I1209 03:16:39.983991 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.000292 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.028132 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.096811 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.216518 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.226551 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.243561 4766 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.243959 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f" gracePeriod=5 Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.334893 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 
03:16:40.400968 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.404243 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.418630 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.434029 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.510759 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.513422 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.600014 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.615036 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.639493 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.826821 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 03:16:40 crc kubenswrapper[4766]: I1209 03:16:40.908574 4766 reflector.go:368] Caches populated for 
*v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.049097 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.091163 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.119090 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.165375 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.213397 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.233420 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.275775 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.353288 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.379341 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.451159 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 03:16:41 crc 
kubenswrapper[4766]: I1209 03:16:41.525094 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.551702 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.556648 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.576047 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.603998 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.873381 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.877573 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 03:16:41 crc kubenswrapper[4766]: I1209 03:16:41.912939 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.053153 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.187034 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.258071 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 
03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.343774 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.561698 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.714520 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.720875 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.736756 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.746898 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.764094 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 03:16:42 crc kubenswrapper[4766]: I1209 03:16:42.812952 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 03:16:43 crc kubenswrapper[4766]: I1209 03:16:43.077485 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 03:16:43 crc kubenswrapper[4766]: I1209 03:16:43.079569 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 03:16:43 crc kubenswrapper[4766]: I1209 03:16:43.154055 4766 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 03:16:43 crc kubenswrapper[4766]: I1209 03:16:43.202201 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 03:16:43 crc kubenswrapper[4766]: I1209 03:16:43.424291 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 03:16:43 crc kubenswrapper[4766]: I1209 03:16:43.635914 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 03:16:43 crc kubenswrapper[4766]: I1209 03:16:43.698657 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 09 03:16:43 crc kubenswrapper[4766]: I1209 03:16:43.942144 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.029105 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.048566 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.086208 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.185254 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.329108 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.404089 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.573773 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.891633 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 03:16:44 crc kubenswrapper[4766]: I1209 03:16:44.923666 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.065975 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.072083 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86449fd7db-ttjdz"] Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.100537 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.227862 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.273357 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.370897 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.376582 4766 
reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.418442 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86449fd7db-ttjdz"] Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.458462 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.472985 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.592247 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.782063 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.848745 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.848855 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873397 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873466 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873496 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873508 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873556 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873615 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873560 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873728 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873766 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873964 4766 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.873988 4766 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.874006 4766 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.886534 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.975448 4766 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:45 crc kubenswrapper[4766]: I1209 03:16:45.975520 4766 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.139623 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.139673 4766 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f" exitCode=137 Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.139743 4766 scope.go:117] "RemoveContainer" containerID="05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.139794 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.141527 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" event={"ID":"bdeacd90-386b-4953-ba45-411e1cde2ed9","Type":"ContainerStarted","Data":"071024a85e0a9ce51a9ec295eb61883feb5269a9c487119f2efce95f6db0d06e"} Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.141564 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" event={"ID":"bdeacd90-386b-4953-ba45-411e1cde2ed9","Type":"ContainerStarted","Data":"313112719a02af9dc2c3f68020c3b616bf11046b39378d734bf3e7abf5f79d4c"} Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.141825 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.151164 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.158346 4766 scope.go:117] "RemoveContainer" containerID="05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f" Dec 09 03:16:46 crc kubenswrapper[4766]: E1209 03:16:46.158945 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f\": container with ID starting with 05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f not found: ID does not exist" containerID="05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.158994 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f"} err="failed to get container status \"05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f\": rpc error: code = NotFound desc = could not find container \"05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f\": container with ID starting with 05ca85eb563e5aaecf932b76432627327add3bd765e4d54a91d270e3663afc3f not found: ID does not exist" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.177413 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86449fd7db-ttjdz" podStartSLOduration=61.177382146 podStartE2EDuration="1m1.177382146s" podCreationTimestamp="2025-12-09 03:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:16:46.175568668 +0000 UTC m=+287.884874134" watchObservedRunningTime="2025-12-09 03:16:46.177382146 +0000 UTC m=+287.886687612" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.530140 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 03:16:46 crc kubenswrapper[4766]: I1209 03:16:46.854002 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 09 03:16:47 crc kubenswrapper[4766]: I1209 03:16:47.122643 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 03:16:47 crc kubenswrapper[4766]: I1209 03:16:47.324389 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 09 03:16:47 crc kubenswrapper[4766]: I1209 03:16:47.540832 4766 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 03:16:48 crc kubenswrapper[4766]: I1209 03:16:48.043537 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 03:16:58 crc kubenswrapper[4766]: I1209 03:16:58.697687 4766 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 09 03:17:02 crc kubenswrapper[4766]: E1209 03:17:02.467236 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf839f7fc_1e7f_4c71_9c02_f456ffacb094.slice/crio-12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60.scope\": RecentStats: unable to find data in memory cache]" Dec 09 03:17:03 crc kubenswrapper[4766]: I1209 03:17:03.271107 4766 generic.go:334] "Generic (PLEG): container finished" podID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerID="12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60" exitCode=0 Dec 09 03:17:03 crc kubenswrapper[4766]: I1209 03:17:03.271237 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" event={"ID":"f839f7fc-1e7f-4c71-9c02-f456ffacb094","Type":"ContainerDied","Data":"12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60"} Dec 09 03:17:03 crc kubenswrapper[4766]: I1209 03:17:03.272401 4766 scope.go:117] "RemoveContainer" containerID="12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60" Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 03:17:04.282916 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 
03:17:04.286876 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 03:17:04.287004 4766 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="49ff95970c00b7e1bfc02dbcc9d587d91db93912cbe67559418d50257b0e8db5" exitCode=137 Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 03:17:04.287192 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"49ff95970c00b7e1bfc02dbcc9d587d91db93912cbe67559418d50257b0e8db5"} Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 03:17:04.287336 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"63a4cd4cd3c2576fc1d494f99ef12e75b3ab4a6a79f4240dc6df205eeec92bb3"} Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 03:17:04.287371 4766 scope.go:117] "RemoveContainer" containerID="d737b63407665158cc9a2d35bbb5bf8b1d5a49188b1ad78e53e0577fc0bc45df" Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 03:17:04.291264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" event={"ID":"f839f7fc-1e7f-4c71-9c02-f456ffacb094","Type":"ContainerStarted","Data":"a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5"} Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 03:17:04.293458 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:17:04 crc kubenswrapper[4766]: I1209 03:17:04.302688 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:17:05 crc kubenswrapper[4766]: I1209 03:17:05.306008 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 09 03:17:13 crc kubenswrapper[4766]: I1209 03:17:13.392338 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:17:13 crc kubenswrapper[4766]: I1209 03:17:13.399991 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:17:14 crc kubenswrapper[4766]: I1209 03:17:14.039935 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:17:14 crc kubenswrapper[4766]: I1209 03:17:14.045329 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 03:17:33 crc kubenswrapper[4766]: I1209 03:17:33.607490 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n6tj2"] Dec 09 03:17:33 crc kubenswrapper[4766]: I1209 03:17:33.608867 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" podUID="882f22c6-5509-4647-a337-121cca0e1622" containerName="controller-manager" containerID="cri-o://128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df" gracePeriod=30 Dec 09 03:17:33 crc kubenswrapper[4766]: I1209 03:17:33.703810 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs"] Dec 09 03:17:33 crc kubenswrapper[4766]: I1209 03:17:33.704003 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" podUID="21953e96-95ef-438b-a25a-e70f7ad6f7be" containerName="route-controller-manager" containerID="cri-o://2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2" gracePeriod=30 Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.145364 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.178660 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.200599 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882f22c6-5509-4647-a337-121cca0e1622-serving-cert\") pod \"882f22c6-5509-4647-a337-121cca0e1622\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.200693 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bqxn\" (UniqueName: \"kubernetes.io/projected/882f22c6-5509-4647-a337-121cca0e1622-kube-api-access-2bqxn\") pod \"882f22c6-5509-4647-a337-121cca0e1622\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.200761 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-client-ca\") pod \"882f22c6-5509-4647-a337-121cca0e1622\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.200821 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-proxy-ca-bundles\") pod \"882f22c6-5509-4647-a337-121cca0e1622\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.200858 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-config\") pod \"882f22c6-5509-4647-a337-121cca0e1622\" (UID: \"882f22c6-5509-4647-a337-121cca0e1622\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.201582 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-client-ca" (OuterVolumeSpecName: "client-ca") pod "882f22c6-5509-4647-a337-121cca0e1622" (UID: "882f22c6-5509-4647-a337-121cca0e1622"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.201777 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "882f22c6-5509-4647-a337-121cca0e1622" (UID: "882f22c6-5509-4647-a337-121cca0e1622"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.201792 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-config" (OuterVolumeSpecName: "config") pod "882f22c6-5509-4647-a337-121cca0e1622" (UID: "882f22c6-5509-4647-a337-121cca0e1622"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.207417 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882f22c6-5509-4647-a337-121cca0e1622-kube-api-access-2bqxn" (OuterVolumeSpecName: "kube-api-access-2bqxn") pod "882f22c6-5509-4647-a337-121cca0e1622" (UID: "882f22c6-5509-4647-a337-121cca0e1622"). InnerVolumeSpecName "kube-api-access-2bqxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.207558 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882f22c6-5509-4647-a337-121cca0e1622-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "882f22c6-5509-4647-a337-121cca0e1622" (UID: "882f22c6-5509-4647-a337-121cca0e1622"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.301700 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4fnc\" (UniqueName: \"kubernetes.io/projected/21953e96-95ef-438b-a25a-e70f7ad6f7be-kube-api-access-w4fnc\") pod \"21953e96-95ef-438b-a25a-e70f7ad6f7be\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.301804 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21953e96-95ef-438b-a25a-e70f7ad6f7be-serving-cert\") pod \"21953e96-95ef-438b-a25a-e70f7ad6f7be\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.301830 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-client-ca\") pod \"21953e96-95ef-438b-a25a-e70f7ad6f7be\" (UID: 
\"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.301901 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-config\") pod \"21953e96-95ef-438b-a25a-e70f7ad6f7be\" (UID: \"21953e96-95ef-438b-a25a-e70f7ad6f7be\") " Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.302074 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.302086 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.302095 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882f22c6-5509-4647-a337-121cca0e1622-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.302103 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882f22c6-5509-4647-a337-121cca0e1622-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.302113 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bqxn\" (UniqueName: \"kubernetes.io/projected/882f22c6-5509-4647-a337-121cca0e1622-kube-api-access-2bqxn\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.302909 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-config" (OuterVolumeSpecName: "config") pod "21953e96-95ef-438b-a25a-e70f7ad6f7be" (UID: 
"21953e96-95ef-438b-a25a-e70f7ad6f7be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.303770 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-client-ca" (OuterVolumeSpecName: "client-ca") pod "21953e96-95ef-438b-a25a-e70f7ad6f7be" (UID: "21953e96-95ef-438b-a25a-e70f7ad6f7be"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.305739 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21953e96-95ef-438b-a25a-e70f7ad6f7be-kube-api-access-w4fnc" (OuterVolumeSpecName: "kube-api-access-w4fnc") pod "21953e96-95ef-438b-a25a-e70f7ad6f7be" (UID: "21953e96-95ef-438b-a25a-e70f7ad6f7be"). InnerVolumeSpecName "kube-api-access-w4fnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.309703 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21953e96-95ef-438b-a25a-e70f7ad6f7be-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "21953e96-95ef-438b-a25a-e70f7ad6f7be" (UID: "21953e96-95ef-438b-a25a-e70f7ad6f7be"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.403394 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21953e96-95ef-438b-a25a-e70f7ad6f7be-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.403453 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.403472 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21953e96-95ef-438b-a25a-e70f7ad6f7be-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.403494 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4fnc\" (UniqueName: \"kubernetes.io/projected/21953e96-95ef-438b-a25a-e70f7ad6f7be-kube-api-access-w4fnc\") on node \"crc\" DevicePath \"\"" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.513666 4766 generic.go:334] "Generic (PLEG): container finished" podID="21953e96-95ef-438b-a25a-e70f7ad6f7be" containerID="2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2" exitCode=0 Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.513776 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" event={"ID":"21953e96-95ef-438b-a25a-e70f7ad6f7be","Type":"ContainerDied","Data":"2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2"} Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.513818 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" 
event={"ID":"21953e96-95ef-438b-a25a-e70f7ad6f7be","Type":"ContainerDied","Data":"4a9500716b90a39d8e7b53529b8200dd83fdccf57dae0c305cda37b9dfb28aff"} Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.513851 4766 scope.go:117] "RemoveContainer" containerID="2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.514028 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.519150 4766 generic.go:334] "Generic (PLEG): container finished" podID="882f22c6-5509-4647-a337-121cca0e1622" containerID="128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df" exitCode=0 Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.519288 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" event={"ID":"882f22c6-5509-4647-a337-121cca0e1622","Type":"ContainerDied","Data":"128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df"} Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.519340 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.519351 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n6tj2" event={"ID":"882f22c6-5509-4647-a337-121cca0e1622","Type":"ContainerDied","Data":"7d93a1cad014c8e16fb6a5a9964397014f7d86040adb5fbe7e5b895e5c821bba"} Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.533095 4766 scope.go:117] "RemoveContainer" containerID="2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2" Dec 09 03:17:34 crc kubenswrapper[4766]: E1209 03:17:34.533803 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2\": container with ID starting with 2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2 not found: ID does not exist" containerID="2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.533854 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2"} err="failed to get container status \"2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2\": rpc error: code = NotFound desc = could not find container \"2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2\": container with ID starting with 2c84f95e0c5618728730cd63f2515ee49d8ef7560d6bd5c993e35f7f689df4f2 not found: ID does not exist" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.533883 4766 scope.go:117] "RemoveContainer" containerID="128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.546502 4766 scope.go:117] "RemoveContainer" 
containerID="128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df" Dec 09 03:17:34 crc kubenswrapper[4766]: E1209 03:17:34.547085 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df\": container with ID starting with 128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df not found: ID does not exist" containerID="128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.547314 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df"} err="failed to get container status \"128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df\": rpc error: code = NotFound desc = could not find container \"128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df\": container with ID starting with 128a1f27edc51a667a6b6386dacfbbf0ab7e01b39620cf59c82ad35126eb87df not found: ID does not exist" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.564998 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n6tj2"] Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.573673 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n6tj2"] Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.579347 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs"] Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.583739 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-nvxzs"] Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.848781 4766 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21953e96-95ef-438b-a25a-e70f7ad6f7be" path="/var/lib/kubelet/pods/21953e96-95ef-438b-a25a-e70f7ad6f7be/volumes" Dec 09 03:17:34 crc kubenswrapper[4766]: I1209 03:17:34.849428 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="882f22c6-5509-4647-a337-121cca0e1622" path="/var/lib/kubelet/pods/882f22c6-5509-4647-a337-121cca0e1622/volumes" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.993035 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm"] Dec 09 03:17:35 crc kubenswrapper[4766]: E1209 03:17:35.993591 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882f22c6-5509-4647-a337-121cca0e1622" containerName="controller-manager" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.993608 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="882f22c6-5509-4647-a337-121cca0e1622" containerName="controller-manager" Dec 09 03:17:35 crc kubenswrapper[4766]: E1209 03:17:35.993634 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21953e96-95ef-438b-a25a-e70f7ad6f7be" containerName="route-controller-manager" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.993642 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="21953e96-95ef-438b-a25a-e70f7ad6f7be" containerName="route-controller-manager" Dec 09 03:17:35 crc kubenswrapper[4766]: E1209 03:17:35.993656 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.993665 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.993768 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.993787 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="21953e96-95ef-438b-a25a-e70f7ad6f7be" containerName="route-controller-manager" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.993797 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="882f22c6-5509-4647-a337-121cca0e1622" containerName="controller-manager" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.994232 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.995858 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.997161 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.997690 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.997912 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.998173 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 03:17:35 crc kubenswrapper[4766]: I1209 03:17:35.998417 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.004109 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-7ff9c64758-r5bsz"] Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.004810 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.009681 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.010550 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.010698 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.010726 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.010835 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.010864 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.013953 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm"] Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.023806 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.023859 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8k4q\" (UniqueName: 
\"kubernetes.io/projected/871504f6-fe8a-4ac1-a97d-bb4125c99736-kube-api-access-c8k4q\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.023931 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/871504f6-fe8a-4ac1-a97d-bb4125c99736-serving-cert\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.023981 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-config\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.024017 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-client-ca\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.029710 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ff9c64758-r5bsz"] Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.125063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8k4q\" (UniqueName: 
\"kubernetes.io/projected/871504f6-fe8a-4ac1-a97d-bb4125c99736-kube-api-access-c8k4q\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.125126 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-config\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.125171 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/871504f6-fe8a-4ac1-a97d-bb4125c99736-serving-cert\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.125353 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-config\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.125457 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-client-ca\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc 
kubenswrapper[4766]: I1209 03:17:36.125532 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11a7d2c4-1941-48d4-9285-210f025702f6-serving-cert\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.125583 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bflm\" (UniqueName: \"kubernetes.io/projected/11a7d2c4-1941-48d4-9285-210f025702f6-kube-api-access-8bflm\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.125620 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-proxy-ca-bundles\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.125691 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-client-ca\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.126537 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-client-ca\") pod 
\"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.126664 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-config\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.129559 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/871504f6-fe8a-4ac1-a97d-bb4125c99736-serving-cert\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.141042 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8k4q\" (UniqueName: \"kubernetes.io/projected/871504f6-fe8a-4ac1-a97d-bb4125c99736-kube-api-access-c8k4q\") pod \"route-controller-manager-8587bbf9b-2n9pm\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.226525 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-config\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.226630 4766 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11a7d2c4-1941-48d4-9285-210f025702f6-serving-cert\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.226664 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bflm\" (UniqueName: \"kubernetes.io/projected/11a7d2c4-1941-48d4-9285-210f025702f6-kube-api-access-8bflm\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.226702 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-proxy-ca-bundles\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.226749 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-client-ca\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.228180 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-client-ca\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc 
kubenswrapper[4766]: I1209 03:17:36.228386 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-proxy-ca-bundles\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.228458 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-config\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.234031 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11a7d2c4-1941-48d4-9285-210f025702f6-serving-cert\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.242653 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bflm\" (UniqueName: \"kubernetes.io/projected/11a7d2c4-1941-48d4-9285-210f025702f6-kube-api-access-8bflm\") pod \"controller-manager-7ff9c64758-r5bsz\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.325280 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.352320 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.572818 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ff9c64758-r5bsz"] Dec 09 03:17:36 crc kubenswrapper[4766]: I1209 03:17:36.736075 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm"] Dec 09 03:17:36 crc kubenswrapper[4766]: W1209 03:17:36.738915 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod871504f6_fe8a_4ac1_a97d_bb4125c99736.slice/crio-44a1f5b838e067e62ab269ea1b35eab37c9adfc0ab4db7c8f98b783bb5261789 WatchSource:0}: Error finding container 44a1f5b838e067e62ab269ea1b35eab37c9adfc0ab4db7c8f98b783bb5261789: Status 404 returned error can't find the container with id 44a1f5b838e067e62ab269ea1b35eab37c9adfc0ab4db7c8f98b783bb5261789 Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.546580 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" event={"ID":"871504f6-fe8a-4ac1-a97d-bb4125c99736","Type":"ContainerStarted","Data":"3f515691600afc71f1ba9dfb560c811b3318a90b22a17a399e21fe7ec55d0962"} Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.546637 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" event={"ID":"871504f6-fe8a-4ac1-a97d-bb4125c99736","Type":"ContainerStarted","Data":"44a1f5b838e067e62ab269ea1b35eab37c9adfc0ab4db7c8f98b783bb5261789"} Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.546896 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 
03:17:37.549501 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" event={"ID":"11a7d2c4-1941-48d4-9285-210f025702f6","Type":"ContainerStarted","Data":"9ca823d62854368c6a1e1c15652500e590b0f42d6a821c6e8ba8f697f182b435"} Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.549539 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" event={"ID":"11a7d2c4-1941-48d4-9285-210f025702f6","Type":"ContainerStarted","Data":"45e2fdeca0b22b897133233d692f0365a4639c91fb5b4060acfa30a86b29af0f"} Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.549806 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.554359 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.555141 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.567568 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" podStartSLOduration=3.567553164 podStartE2EDuration="3.567553164s" podCreationTimestamp="2025-12-09 03:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:17:37.564272877 +0000 UTC m=+339.273578313" watchObservedRunningTime="2025-12-09 03:17:37.567553164 +0000 UTC m=+339.276858590" Dec 09 03:17:37 crc kubenswrapper[4766]: I1209 03:17:37.597517 4766 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" podStartSLOduration=4.597501545 podStartE2EDuration="4.597501545s" podCreationTimestamp="2025-12-09 03:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:17:37.59425657 +0000 UTC m=+339.303561996" watchObservedRunningTime="2025-12-09 03:17:37.597501545 +0000 UTC m=+339.306806971" Dec 09 03:18:07 crc kubenswrapper[4766]: I1209 03:18:07.316551 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:18:07 crc kubenswrapper[4766]: I1209 03:18:07.317134 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.802430 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vt2b"] Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.804309 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6vt2b" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerName="registry-server" containerID="cri-o://b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34" gracePeriod=30 Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.821298 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6mbs"] Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.821535 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6mbs" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="registry-server" containerID="cri-o://c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6" gracePeriod=30 Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.849667 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swbbr"] Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.849724 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxzck"] Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.849979 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerName="marketplace-operator" containerID="cri-o://a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5" gracePeriod=30 Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.850400 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pxzck" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerName="registry-server" containerID="cri-o://7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90" gracePeriod=30 Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.853086 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnkzl"] Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.853394 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qnkzl" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerName="registry-server" containerID="cri-o://3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e" gracePeriod=30 Dec 09 03:18:12 crc kubenswrapper[4766]: 
I1209 03:18:12.854971 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wccgw"] Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.855651 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.883993 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wccgw"] Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.986197 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4111b91-5da1-4878-bcd4-2f2b34174e52-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wccgw\" (UID: \"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.986280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b4111b91-5da1-4878-bcd4-2f2b34174e52-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wccgw\" (UID: \"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:12 crc kubenswrapper[4766]: I1209 03:18:12.986361 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcld5\" (UniqueName: \"kubernetes.io/projected/b4111b91-5da1-4878-bcd4-2f2b34174e52-kube-api-access-jcld5\") pod \"marketplace-operator-79b997595-wccgw\" (UID: \"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.087006 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4111b91-5da1-4878-bcd4-2f2b34174e52-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wccgw\" (UID: \"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.087120 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b4111b91-5da1-4878-bcd4-2f2b34174e52-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wccgw\" (UID: \"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.087171 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcld5\" (UniqueName: \"kubernetes.io/projected/b4111b91-5da1-4878-bcd4-2f2b34174e52-kube-api-access-jcld5\") pod \"marketplace-operator-79b997595-wccgw\" (UID: \"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.088430 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4111b91-5da1-4878-bcd4-2f2b34174e52-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wccgw\" (UID: \"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.095822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b4111b91-5da1-4878-bcd4-2f2b34174e52-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wccgw\" (UID: 
\"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.108617 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcld5\" (UniqueName: \"kubernetes.io/projected/b4111b91-5da1-4878-bcd4-2f2b34174e52-kube-api-access-jcld5\") pod \"marketplace-operator-79b997595-wccgw\" (UID: \"b4111b91-5da1-4878-bcd4-2f2b34174e52\") " pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.234555 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.297483 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:18:13 crc kubenswrapper[4766]: E1209 03:18:13.323935 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6 is running failed: container process not found" containerID="c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 03:18:13 crc kubenswrapper[4766]: E1209 03:18:13.324506 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6 is running failed: container process not found" containerID="c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 03:18:13 crc kubenswrapper[4766]: E1209 03:18:13.324835 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6 is running failed: container process not found" containerID="c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 03:18:13 crc kubenswrapper[4766]: E1209 03:18:13.324865 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-j6mbs" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="registry-server" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.351709 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.396411 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-utilities\") pod \"dd777685-39e7-46bb-824f-f19ceaa179ca\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.396515 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlrzt\" (UniqueName: \"kubernetes.io/projected/dd777685-39e7-46bb-824f-f19ceaa179ca-kube-api-access-dlrzt\") pod \"dd777685-39e7-46bb-824f-f19ceaa179ca\" (UID: \"dd777685-39e7-46bb-824f-f19ceaa179ca\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.396563 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-catalog-content\") pod \"dd777685-39e7-46bb-824f-f19ceaa179ca\" (UID: 
\"dd777685-39e7-46bb-824f-f19ceaa179ca\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.401703 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-utilities" (OuterVolumeSpecName: "utilities") pod "dd777685-39e7-46bb-824f-f19ceaa179ca" (UID: "dd777685-39e7-46bb-824f-f19ceaa179ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.405477 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd777685-39e7-46bb-824f-f19ceaa179ca-kube-api-access-dlrzt" (OuterVolumeSpecName: "kube-api-access-dlrzt") pod "dd777685-39e7-46bb-824f-f19ceaa179ca" (UID: "dd777685-39e7-46bb-824f-f19ceaa179ca"). InnerVolumeSpecName "kube-api-access-dlrzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.405607 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.411752 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.423108 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.474704 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd777685-39e7-46bb-824f-f19ceaa179ca" (UID: "dd777685-39e7-46bb-824f-f19ceaa179ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.498633 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-utilities\") pod \"b2c59da0-4314-401f-9b47-cd3df45f4d26\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.498702 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics\") pod \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.498863 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-catalog-content\") pod \"9fee64f2-1434-4082-9674-088e4d93cb9a\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.498908 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-utilities\") pod \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.498955 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca\") pod \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.498971 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-catalog-content\") pod \"b2c59da0-4314-401f-9b47-cd3df45f4d26\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.499002 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-catalog-content\") pod \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\" (UID: \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.499024 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-utilities\") pod \"9fee64f2-1434-4082-9674-088e4d93cb9a\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.499048 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tmvk\" (UniqueName: \"kubernetes.io/projected/9fee64f2-1434-4082-9674-088e4d93cb9a-kube-api-access-8tmvk\") pod \"9fee64f2-1434-4082-9674-088e4d93cb9a\" (UID: \"9fee64f2-1434-4082-9674-088e4d93cb9a\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.499067 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kfdv\" (UniqueName: \"kubernetes.io/projected/f839f7fc-1e7f-4c71-9c02-f456ffacb094-kube-api-access-6kfdv\") pod \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\" (UID: \"f839f7fc-1e7f-4c71-9c02-f456ffacb094\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.499098 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfm7z\" (UniqueName: \"kubernetes.io/projected/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-kube-api-access-bfm7z\") pod \"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\" (UID: 
\"8a05d77b-68a0-4e96-b71e-f5168ee4d38f\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.502684 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-utilities" (OuterVolumeSpecName: "utilities") pod "b2c59da0-4314-401f-9b47-cd3df45f4d26" (UID: "b2c59da0-4314-401f-9b47-cd3df45f4d26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.502726 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8f2q\" (UniqueName: \"kubernetes.io/projected/b2c59da0-4314-401f-9b47-cd3df45f4d26-kube-api-access-h8f2q\") pod \"b2c59da0-4314-401f-9b47-cd3df45f4d26\" (UID: \"b2c59da0-4314-401f-9b47-cd3df45f4d26\") " Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.503163 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-utilities" (OuterVolumeSpecName: "utilities") pod "8a05d77b-68a0-4e96-b71e-f5168ee4d38f" (UID: "8a05d77b-68a0-4e96-b71e-f5168ee4d38f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.503656 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f839f7fc-1e7f-4c71-9c02-f456ffacb094" (UID: "f839f7fc-1e7f-4c71-9c02-f456ffacb094"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.513526 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-utilities" (OuterVolumeSpecName: "utilities") pod "9fee64f2-1434-4082-9674-088e4d93cb9a" (UID: "9fee64f2-1434-4082-9674-088e4d93cb9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.517689 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.518125 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.518147 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlrzt\" (UniqueName: \"kubernetes.io/projected/dd777685-39e7-46bb-824f-f19ceaa179ca-kube-api-access-dlrzt\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.518160 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.518174 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd777685-39e7-46bb-824f-f19ceaa179ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.518459 4766 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.525704 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f839f7fc-1e7f-4c71-9c02-f456ffacb094-kube-api-access-6kfdv" (OuterVolumeSpecName: "kube-api-access-6kfdv") pod "f839f7fc-1e7f-4c71-9c02-f456ffacb094" (UID: "f839f7fc-1e7f-4c71-9c02-f456ffacb094"). InnerVolumeSpecName "kube-api-access-6kfdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.539539 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f839f7fc-1e7f-4c71-9c02-f456ffacb094" (UID: "f839f7fc-1e7f-4c71-9c02-f456ffacb094"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.540017 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fee64f2-1434-4082-9674-088e4d93cb9a-kube-api-access-8tmvk" (OuterVolumeSpecName: "kube-api-access-8tmvk") pod "9fee64f2-1434-4082-9674-088e4d93cb9a" (UID: "9fee64f2-1434-4082-9674-088e4d93cb9a"). InnerVolumeSpecName "kube-api-access-8tmvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.541121 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-kube-api-access-bfm7z" (OuterVolumeSpecName: "kube-api-access-bfm7z") pod "8a05d77b-68a0-4e96-b71e-f5168ee4d38f" (UID: "8a05d77b-68a0-4e96-b71e-f5168ee4d38f"). InnerVolumeSpecName "kube-api-access-bfm7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.546528 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c59da0-4314-401f-9b47-cd3df45f4d26-kube-api-access-h8f2q" (OuterVolumeSpecName: "kube-api-access-h8f2q") pod "b2c59da0-4314-401f-9b47-cd3df45f4d26" (UID: "b2c59da0-4314-401f-9b47-cd3df45f4d26"). InnerVolumeSpecName "kube-api-access-h8f2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.555236 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2c59da0-4314-401f-9b47-cd3df45f4d26" (UID: "b2c59da0-4314-401f-9b47-cd3df45f4d26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.599331 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ff9c64758-r5bsz"] Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.600035 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" podUID="11a7d2c4-1941-48d4-9285-210f025702f6" containerName="controller-manager" containerID="cri-o://9ca823d62854368c6a1e1c15652500e590b0f42d6a821c6e8ba8f697f182b435" gracePeriod=30 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.607716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a05d77b-68a0-4e96-b71e-f5168ee4d38f" (UID: "8a05d77b-68a0-4e96-b71e-f5168ee4d38f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.617798 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm"] Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620553 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c59da0-4314-401f-9b47-cd3df45f4d26-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620587 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620609 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620618 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tmvk\" (UniqueName: \"kubernetes.io/projected/9fee64f2-1434-4082-9674-088e4d93cb9a-kube-api-access-8tmvk\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620630 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kfdv\" (UniqueName: \"kubernetes.io/projected/f839f7fc-1e7f-4c71-9c02-f456ffacb094-kube-api-access-6kfdv\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620639 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfm7z\" (UniqueName: \"kubernetes.io/projected/8a05d77b-68a0-4e96-b71e-f5168ee4d38f-kube-api-access-bfm7z\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620648 4766 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-h8f2q\" (UniqueName: \"kubernetes.io/projected/b2c59da0-4314-401f-9b47-cd3df45f4d26-kube-api-access-h8f2q\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620657 4766 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f839f7fc-1e7f-4c71-9c02-f456ffacb094-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.620969 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" podUID="871504f6-fe8a-4ac1-a97d-bb4125c99736" containerName="route-controller-manager" containerID="cri-o://3f515691600afc71f1ba9dfb560c811b3318a90b22a17a399e21fe7ec55d0962" gracePeriod=30 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.690359 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fee64f2-1434-4082-9674-088e4d93cb9a" (UID: "9fee64f2-1434-4082-9674-088e4d93cb9a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.720042 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wccgw"] Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.722272 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fee64f2-1434-4082-9674-088e4d93cb9a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.776259 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" event={"ID":"b4111b91-5da1-4878-bcd4-2f2b34174e52","Type":"ContainerStarted","Data":"2d55c1e3ebc26dc2bae98717842eca0eae668721a414aebbc83cbf0d92276544"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.789150 4766 generic.go:334] "Generic (PLEG): container finished" podID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerID="3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e" exitCode=0 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.789253 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnkzl" event={"ID":"9fee64f2-1434-4082-9674-088e4d93cb9a","Type":"ContainerDied","Data":"3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.789275 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qnkzl" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.789305 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnkzl" event={"ID":"9fee64f2-1434-4082-9674-088e4d93cb9a","Type":"ContainerDied","Data":"30410c135c8bfa974e3f7889a8f69689180eabf8b7f4f60535ae276c1dd1d1a9"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.789323 4766 scope.go:117] "RemoveContainer" containerID="3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.809446 4766 generic.go:334] "Generic (PLEG): container finished" podID="871504f6-fe8a-4ac1-a97d-bb4125c99736" containerID="3f515691600afc71f1ba9dfb560c811b3318a90b22a17a399e21fe7ec55d0962" exitCode=0 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.809551 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" event={"ID":"871504f6-fe8a-4ac1-a97d-bb4125c99736","Type":"ContainerDied","Data":"3f515691600afc71f1ba9dfb560c811b3318a90b22a17a399e21fe7ec55d0962"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.825540 4766 generic.go:334] "Generic (PLEG): container finished" podID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerID="b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34" exitCode=0 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.825665 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vt2b" event={"ID":"dd777685-39e7-46bb-824f-f19ceaa179ca","Type":"ContainerDied","Data":"b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.825706 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vt2b" 
event={"ID":"dd777685-39e7-46bb-824f-f19ceaa179ca","Type":"ContainerDied","Data":"2f2f711b747d17009217303fe11c2e2cdc3f1d8fd0cc5d4af7d4166289d930a6"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.825802 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vt2b" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.832841 4766 scope.go:117] "RemoveContainer" containerID="c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.855439 4766 generic.go:334] "Generic (PLEG): container finished" podID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerID="c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6" exitCode=0 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.855526 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6mbs" event={"ID":"8a05d77b-68a0-4e96-b71e-f5168ee4d38f","Type":"ContainerDied","Data":"c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.855550 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6mbs" event={"ID":"8a05d77b-68a0-4e96-b71e-f5168ee4d38f","Type":"ContainerDied","Data":"55040f3463f3b9b8a2b1a8bf0a19c46d78109380db06d5b65cc9344f2237e8af"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.855634 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6mbs" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.867618 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnkzl"] Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.883127 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qnkzl"] Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.891058 4766 generic.go:334] "Generic (PLEG): container finished" podID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerID="7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90" exitCode=0 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.891151 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxzck" event={"ID":"b2c59da0-4314-401f-9b47-cd3df45f4d26","Type":"ContainerDied","Data":"7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.891196 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxzck" event={"ID":"b2c59da0-4314-401f-9b47-cd3df45f4d26","Type":"ContainerDied","Data":"8c315f27d05f11ea8219aed9b0e9a1f2ff87e187ad430bd868f9ec46a9e5b24b"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.891328 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxzck" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.906876 4766 generic.go:334] "Generic (PLEG): container finished" podID="11a7d2c4-1941-48d4-9285-210f025702f6" containerID="9ca823d62854368c6a1e1c15652500e590b0f42d6a821c6e8ba8f697f182b435" exitCode=0 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.906960 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" event={"ID":"11a7d2c4-1941-48d4-9285-210f025702f6","Type":"ContainerDied","Data":"9ca823d62854368c6a1e1c15652500e590b0f42d6a821c6e8ba8f697f182b435"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.913087 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vt2b"] Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.931627 4766 scope.go:117] "RemoveContainer" containerID="4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.940248 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6vt2b"] Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.943848 4766 generic.go:334] "Generic (PLEG): container finished" podID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerID="a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5" exitCode=0 Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.943911 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" event={"ID":"f839f7fc-1e7f-4c71-9c02-f456ffacb094","Type":"ContainerDied","Data":"a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.943945 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" 
event={"ID":"f839f7fc-1e7f-4c71-9c02-f456ffacb094","Type":"ContainerDied","Data":"e81a896603ec654897241fb82280ae101b599b6642dab0dcfbb32453140140ec"} Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.944041 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-swbbr" Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.958119 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6mbs"] Dec 09 03:18:13 crc kubenswrapper[4766]: I1209 03:18:13.961245 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6mbs"] Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.007130 4766 scope.go:117] "RemoveContainer" containerID="3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.010639 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e\": container with ID starting with 3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e not found: ID does not exist" containerID="3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.010772 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e"} err="failed to get container status \"3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e\": rpc error: code = NotFound desc = could not find container \"3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e\": container with ID starting with 3b3a182d47d97b00807ccc24017e69e7a645e6f88a882f8b35e1810ccc55927e not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 
03:18:14.010936 4766 scope.go:117] "RemoveContainer" containerID="c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.011373 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13\": container with ID starting with c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13 not found: ID does not exist" containerID="c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.011458 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13"} err="failed to get container status \"c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13\": rpc error: code = NotFound desc = could not find container \"c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13\": container with ID starting with c9eca46dd786478ff7b495c1cb40d4748ff512e6fe4522c3203e8deb8511ce13 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.011533 4766 scope.go:117] "RemoveContainer" containerID="4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.012114 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883\": container with ID starting with 4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883 not found: ID does not exist" containerID="4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.012175 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883"} err="failed to get container status \"4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883\": rpc error: code = NotFound desc = could not find container \"4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883\": container with ID starting with 4e618336bae4c150589d508b50130cf1edc751d661d7c1476ec9330c5f5a1883 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.012269 4766 scope.go:117] "RemoveContainer" containerID="b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.014762 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxzck"] Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.020359 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxzck"] Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.034153 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swbbr"] Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.041187 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swbbr"] Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.074262 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.088083 4766 scope.go:117] "RemoveContainer" containerID="e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.129347 4766 scope.go:117] "RemoveContainer" containerID="082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.146169 4766 scope.go:117] "RemoveContainer" containerID="b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.146647 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34\": container with ID starting with b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34 not found: ID does not exist" containerID="b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.146692 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34"} err="failed to get container status \"b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34\": rpc error: code = NotFound desc = could not find container \"b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34\": container with ID starting with b9fc9d5f602eeea6f335bd418bee7f3d36f7f565dee1a0ade3436c5e3b3f4b34 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.146714 4766 scope.go:117] "RemoveContainer" containerID="e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.146972 4766 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6\": container with ID starting with e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6 not found: ID does not exist" containerID="e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.147015 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6"} err="failed to get container status \"e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6\": rpc error: code = NotFound desc = could not find container \"e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6\": container with ID starting with e3cbbd388fcd6911c452a2b1a126576dd35f471c2700afd43035df0be150d6a6 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.147034 4766 scope.go:117] "RemoveContainer" containerID="082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.147514 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e\": container with ID starting with 082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e not found: ID does not exist" containerID="082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.147533 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e"} err="failed to get container status \"082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e\": rpc error: code = NotFound desc = could not find container 
\"082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e\": container with ID starting with 082095ce1ba7dd0eba36d50bc3ce6b4241bd8bf3083f58133b9d846725b5cd0e not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.147576 4766 scope.go:117] "RemoveContainer" containerID="c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.158030 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.170954 4766 scope.go:117] "RemoveContainer" containerID="9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.195241 4766 scope.go:117] "RemoveContainer" containerID="b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.214705 4766 scope.go:117] "RemoveContainer" containerID="c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.215173 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6\": container with ID starting with c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6 not found: ID does not exist" containerID="c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.215248 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6"} err="failed to get container status \"c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6\": rpc error: code = NotFound desc = could not find 
container \"c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6\": container with ID starting with c9d04ecc50b492a489d3ae405daa4b8356814ecd68d54ea9324512f963ba90f6 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.215285 4766 scope.go:117] "RemoveContainer" containerID="9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.215834 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c\": container with ID starting with 9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c not found: ID does not exist" containerID="9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.215882 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c"} err="failed to get container status \"9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c\": rpc error: code = NotFound desc = could not find container \"9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c\": container with ID starting with 9a494045cfc485527cf4245c4fe7c3c103fe0363a5f1d81e5e05875d5be05d6c not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.215913 4766 scope.go:117] "RemoveContainer" containerID="b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.216321 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0\": container with ID starting with b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0 not found: ID does 
not exist" containerID="b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.216384 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0"} err="failed to get container status \"b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0\": rpc error: code = NotFound desc = could not find container \"b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0\": container with ID starting with b2842ae36919b171fa2da10977416ab3de102737753000a52243a00b71ca75d0 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.216414 4766 scope.go:117] "RemoveContainer" containerID="7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.229091 4766 scope.go:117] "RemoveContainer" containerID="300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237524 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bflm\" (UniqueName: \"kubernetes.io/projected/11a7d2c4-1941-48d4-9285-210f025702f6-kube-api-access-8bflm\") pod \"11a7d2c4-1941-48d4-9285-210f025702f6\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237576 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8k4q\" (UniqueName: \"kubernetes.io/projected/871504f6-fe8a-4ac1-a97d-bb4125c99736-kube-api-access-c8k4q\") pod \"871504f6-fe8a-4ac1-a97d-bb4125c99736\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237607 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-client-ca\") pod \"871504f6-fe8a-4ac1-a97d-bb4125c99736\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237652 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-proxy-ca-bundles\") pod \"11a7d2c4-1941-48d4-9285-210f025702f6\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237681 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11a7d2c4-1941-48d4-9285-210f025702f6-serving-cert\") pod \"11a7d2c4-1941-48d4-9285-210f025702f6\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237715 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-config\") pod \"11a7d2c4-1941-48d4-9285-210f025702f6\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237738 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/871504f6-fe8a-4ac1-a97d-bb4125c99736-serving-cert\") pod \"871504f6-fe8a-4ac1-a97d-bb4125c99736\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237769 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-client-ca\") pod \"11a7d2c4-1941-48d4-9285-210f025702f6\" (UID: \"11a7d2c4-1941-48d4-9285-210f025702f6\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.237796 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-config\") pod \"871504f6-fe8a-4ac1-a97d-bb4125c99736\" (UID: \"871504f6-fe8a-4ac1-a97d-bb4125c99736\") " Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.238485 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-config" (OuterVolumeSpecName: "config") pod "871504f6-fe8a-4ac1-a97d-bb4125c99736" (UID: "871504f6-fe8a-4ac1-a97d-bb4125c99736"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.238915 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-config" (OuterVolumeSpecName: "config") pod "11a7d2c4-1941-48d4-9285-210f025702f6" (UID: "11a7d2c4-1941-48d4-9285-210f025702f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.239251 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-client-ca" (OuterVolumeSpecName: "client-ca") pod "871504f6-fe8a-4ac1-a97d-bb4125c99736" (UID: "871504f6-fe8a-4ac1-a97d-bb4125c99736"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.239593 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-client-ca" (OuterVolumeSpecName: "client-ca") pod "11a7d2c4-1941-48d4-9285-210f025702f6" (UID: "11a7d2c4-1941-48d4-9285-210f025702f6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.239929 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "11a7d2c4-1941-48d4-9285-210f025702f6" (UID: "11a7d2c4-1941-48d4-9285-210f025702f6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.242250 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a7d2c4-1941-48d4-9285-210f025702f6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "11a7d2c4-1941-48d4-9285-210f025702f6" (UID: "11a7d2c4-1941-48d4-9285-210f025702f6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.242479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871504f6-fe8a-4ac1-a97d-bb4125c99736-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "871504f6-fe8a-4ac1-a97d-bb4125c99736" (UID: "871504f6-fe8a-4ac1-a97d-bb4125c99736"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.242500 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871504f6-fe8a-4ac1-a97d-bb4125c99736-kube-api-access-c8k4q" (OuterVolumeSpecName: "kube-api-access-c8k4q") pod "871504f6-fe8a-4ac1-a97d-bb4125c99736" (UID: "871504f6-fe8a-4ac1-a97d-bb4125c99736"). InnerVolumeSpecName "kube-api-access-c8k4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.243795 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a7d2c4-1941-48d4-9285-210f025702f6-kube-api-access-8bflm" (OuterVolumeSpecName: "kube-api-access-8bflm") pod "11a7d2c4-1941-48d4-9285-210f025702f6" (UID: "11a7d2c4-1941-48d4-9285-210f025702f6"). InnerVolumeSpecName "kube-api-access-8bflm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.246374 4766 scope.go:117] "RemoveContainer" containerID="779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.261197 4766 scope.go:117] "RemoveContainer" containerID="7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.262240 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90\": container with ID starting with 7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90 not found: ID does not exist" containerID="7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.262286 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90"} err="failed to get container status \"7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90\": rpc error: code = NotFound desc = could not find container \"7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90\": container with ID starting with 7105cb4af8f0113c29002f85e05be5562a5d9d70f4adea9d08e5b27a96ea1a90 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.262311 
4766 scope.go:117] "RemoveContainer" containerID="300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.262684 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c\": container with ID starting with 300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c not found: ID does not exist" containerID="300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.262730 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c"} err="failed to get container status \"300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c\": rpc error: code = NotFound desc = could not find container \"300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c\": container with ID starting with 300f19f617cc33b4baf552fcdc4a24757a5402f6ff9ee76013b4aacb3880882c not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.262761 4766 scope.go:117] "RemoveContainer" containerID="779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.263146 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8\": container with ID starting with 779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8 not found: ID does not exist" containerID="779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.263174 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8"} err="failed to get container status \"779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8\": rpc error: code = NotFound desc = could not find container \"779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8\": container with ID starting with 779f7bd98be2017596b3664b7854daecfca34650bbc877983c204914b757c5f8 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.263189 4766 scope.go:117] "RemoveContainer" containerID="a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.273499 4766 scope.go:117] "RemoveContainer" containerID="12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.288274 4766 scope.go:117] "RemoveContainer" containerID="a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.290395 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5\": container with ID starting with a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5 not found: ID does not exist" containerID="a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.290427 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5"} err="failed to get container status \"a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5\": rpc error: code = NotFound desc = could not find container \"a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5\": container with ID starting with 
a26c1a738195477f6d8eb1fca6ba55441cd7cf755339f52f4750073c5dd631d5 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.290451 4766 scope.go:117] "RemoveContainer" containerID="12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.290819 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60\": container with ID starting with 12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60 not found: ID does not exist" containerID="12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.290843 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60"} err="failed to get container status \"12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60\": rpc error: code = NotFound desc = could not find container \"12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60\": container with ID starting with 12dd0662277f57dc890f8b04f2567c1c8857b24f591ba88a05edbf85a1812b60 not found: ID does not exist" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339152 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339186 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339198 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bflm\" 
(UniqueName: \"kubernetes.io/projected/11a7d2c4-1941-48d4-9285-210f025702f6-kube-api-access-8bflm\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339225 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8k4q\" (UniqueName: \"kubernetes.io/projected/871504f6-fe8a-4ac1-a97d-bb4125c99736-kube-api-access-c8k4q\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339234 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/871504f6-fe8a-4ac1-a97d-bb4125c99736-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339242 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339251 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11a7d2c4-1941-48d4-9285-210f025702f6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339259 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11a7d2c4-1941-48d4-9285-210f025702f6-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.339268 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/871504f6-fe8a-4ac1-a97d-bb4125c99736-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621148 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dtw8c"] Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621380 4766 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621394 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621410 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerName="extract-utilities" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621417 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerName="extract-utilities" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621429 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerName="marketplace-operator" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621437 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerName="marketplace-operator" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621447 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621455 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621469 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerName="marketplace-operator" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621477 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerName="marketplace-operator" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621490 4766 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="extract-content" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621499 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="extract-content" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621511 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerName="extract-content" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621518 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerName="extract-content" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621527 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871504f6-fe8a-4ac1-a97d-bb4125c99736" containerName="route-controller-manager" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621536 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="871504f6-fe8a-4ac1-a97d-bb4125c99736" containerName="route-controller-manager" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621547 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerName="extract-utilities" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621554 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerName="extract-utilities" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621564 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerName="extract-content" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621572 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerName="extract-content" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621581 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a7d2c4-1941-48d4-9285-210f025702f6" containerName="controller-manager" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621589 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a7d2c4-1941-48d4-9285-210f025702f6" containerName="controller-manager" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621597 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerName="extract-content" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621606 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerName="extract-content" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621615 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621622 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621631 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621638 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621649 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="extract-utilities" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621656 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="extract-utilities" Dec 09 03:18:14 crc kubenswrapper[4766]: E1209 03:18:14.621667 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerName="extract-utilities" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621675 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerName="extract-utilities" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621778 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerName="marketplace-operator" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621790 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621800 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" containerName="marketplace-operator" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621810 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="871504f6-fe8a-4ac1-a97d-bb4125c99736" containerName="route-controller-manager" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621825 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621836 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" containerName="registry-server" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621845 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a7d2c4-1941-48d4-9285-210f025702f6" containerName="controller-manager" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.621855 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" containerName="registry-server" Dec 09 03:18:14 crc 
kubenswrapper[4766]: I1209 03:18:14.622650 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.624470 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.636890 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtw8c"] Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.747183 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ccjr\" (UniqueName: \"kubernetes.io/projected/29393907-e416-489a-a2dc-92c75c0284bc-kube-api-access-2ccjr\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.747282 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-catalog-content\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.747307 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-utilities\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.848185 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-catalog-content\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.848754 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-utilities\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.848832 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-catalog-content\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.848896 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ccjr\" (UniqueName: \"kubernetes.io/projected/29393907-e416-489a-a2dc-92c75c0284bc-kube-api-access-2ccjr\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.849117 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-utilities\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.852031 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a05d77b-68a0-4e96-b71e-f5168ee4d38f" 
path="/var/lib/kubelet/pods/8a05d77b-68a0-4e96-b71e-f5168ee4d38f/volumes" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.852821 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fee64f2-1434-4082-9674-088e4d93cb9a" path="/var/lib/kubelet/pods/9fee64f2-1434-4082-9674-088e4d93cb9a/volumes" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.853526 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c59da0-4314-401f-9b47-cd3df45f4d26" path="/var/lib/kubelet/pods/b2c59da0-4314-401f-9b47-cd3df45f4d26/volumes" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.854678 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd777685-39e7-46bb-824f-f19ceaa179ca" path="/var/lib/kubelet/pods/dd777685-39e7-46bb-824f-f19ceaa179ca/volumes" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.855386 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f839f7fc-1e7f-4c71-9c02-f456ffacb094" path="/var/lib/kubelet/pods/f839f7fc-1e7f-4c71-9c02-f456ffacb094/volumes" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.875344 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ccjr\" (UniqueName: \"kubernetes.io/projected/29393907-e416-489a-a2dc-92c75c0284bc-kube-api-access-2ccjr\") pod \"certified-operators-dtw8c\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.946986 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.951737 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" event={"ID":"871504f6-fe8a-4ac1-a97d-bb4125c99736","Type":"ContainerDied","Data":"44a1f5b838e067e62ab269ea1b35eab37c9adfc0ab4db7c8f98b783bb5261789"} Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.951784 4766 scope.go:117] "RemoveContainer" containerID="3f515691600afc71f1ba9dfb560c811b3318a90b22a17a399e21fe7ec55d0962" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.951874 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.961323 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" event={"ID":"11a7d2c4-1941-48d4-9285-210f025702f6","Type":"ContainerDied","Data":"45e2fdeca0b22b897133233d692f0365a4639c91fb5b4060acfa30a86b29af0f"} Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.961854 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff9c64758-r5bsz" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.966401 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" event={"ID":"b4111b91-5da1-4878-bcd4-2f2b34174e52","Type":"ContainerStarted","Data":"b4321e28d08b748bb48f90f4ae566b72ce2a7d03bcdafcac030c4a2f84c28627"} Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.967172 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.971455 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" Dec 09 03:18:14 crc kubenswrapper[4766]: I1209 03:18:14.988641 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wccgw" podStartSLOduration=2.988589755 podStartE2EDuration="2.988589755s" podCreationTimestamp="2025-12-09 03:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:18:14.980986563 +0000 UTC m=+376.690291989" watchObservedRunningTime="2025-12-09 03:18:14.988589755 +0000 UTC m=+376.697895181" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.019707 4766 scope.go:117] "RemoveContainer" containerID="9ca823d62854368c6a1e1c15652500e590b0f42d6a821c6e8ba8f697f182b435" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.023814 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.024503 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.035776 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.037507 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.040403 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.041255 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.041903 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.043293 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.072765 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-695fc9d698-pt5m5"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.074952 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.077851 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.077982 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ff9c64758-r5bsz"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.078637 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.078793 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.079516 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.079977 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.081133 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.081828 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.086496 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7ff9c64758-r5bsz"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.088345 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 03:18:15 crc 
kubenswrapper[4766]: I1209 03:18:15.091323 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-695fc9d698-pt5m5"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.095275 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.099873 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8587bbf9b-2n9pm"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153076 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-config\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153158 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b67498-9d67-4643-b30d-14fe79b2ced8-serving-cert\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153309 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c85ffb2a-937f-458b-b05c-186633a2265b-serving-cert\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153366 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz4d6\" (UniqueName: \"kubernetes.io/projected/c9b67498-9d67-4643-b30d-14fe79b2ced8-kube-api-access-hz4d6\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153429 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9b67498-9d67-4643-b30d-14fe79b2ced8-client-ca\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153460 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b67498-9d67-4643-b30d-14fe79b2ced8-config\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153795 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mb2t\" (UniqueName: \"kubernetes.io/projected/c85ffb2a-937f-458b-b05c-186633a2265b-kube-api-access-9mb2t\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153863 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-proxy-ca-bundles\") 
pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.153905 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-client-ca\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.163044 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtw8c"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254648 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-proxy-ca-bundles\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254722 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-client-ca\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254749 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-config\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " 
pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254778 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b67498-9d67-4643-b30d-14fe79b2ced8-serving-cert\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c85ffb2a-937f-458b-b05c-186633a2265b-serving-cert\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254825 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4d6\" (UniqueName: \"kubernetes.io/projected/c9b67498-9d67-4643-b30d-14fe79b2ced8-kube-api-access-hz4d6\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254858 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9b67498-9d67-4643-b30d-14fe79b2ced8-client-ca\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254880 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c9b67498-9d67-4643-b30d-14fe79b2ced8-config\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.254918 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mb2t\" (UniqueName: \"kubernetes.io/projected/c85ffb2a-937f-458b-b05c-186633a2265b-kube-api-access-9mb2t\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.256598 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-client-ca\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.256636 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-proxy-ca-bundles\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.257341 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9b67498-9d67-4643-b30d-14fe79b2ced8-client-ca\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.258414 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b67498-9d67-4643-b30d-14fe79b2ced8-config\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.258466 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c85ffb2a-937f-458b-b05c-186633a2265b-config\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.261632 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b67498-9d67-4643-b30d-14fe79b2ced8-serving-cert\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.261655 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c85ffb2a-937f-458b-b05c-186633a2265b-serving-cert\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.269506 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mb2t\" (UniqueName: \"kubernetes.io/projected/c85ffb2a-937f-458b-b05c-186633a2265b-kube-api-access-9mb2t\") pod \"controller-manager-695fc9d698-pt5m5\" (UID: \"c85ffb2a-937f-458b-b05c-186633a2265b\") " 
pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.275586 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz4d6\" (UniqueName: \"kubernetes.io/projected/c9b67498-9d67-4643-b30d-14fe79b2ced8-kube-api-access-hz4d6\") pod \"route-controller-manager-8945d6fbc-pkg7q\" (UID: \"c9b67498-9d67-4643-b30d-14fe79b2ced8\") " pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.392205 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.401348 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.805409 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-695fc9d698-pt5m5"] Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.808195 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q"] Dec 09 03:18:15 crc kubenswrapper[4766]: W1209 03:18:15.813995 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b67498_9d67_4643_b30d_14fe79b2ced8.slice/crio-54a213df9ef8318addaa5b6a2955c67678e79cc92cd3eec084c76af9596886ea WatchSource:0}: Error finding container 54a213df9ef8318addaa5b6a2955c67678e79cc92cd3eec084c76af9596886ea: Status 404 returned error can't find the container with id 54a213df9ef8318addaa5b6a2955c67678e79cc92cd3eec084c76af9596886ea Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.974043 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" event={"ID":"c85ffb2a-937f-458b-b05c-186633a2265b","Type":"ContainerStarted","Data":"6b6f3d88dc459c47c97044c7b2ee6d7a4b9aa33af705d084b55ca7284694bcb9"} Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.974394 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" event={"ID":"c85ffb2a-937f-458b-b05c-186633a2265b","Type":"ContainerStarted","Data":"5cd510e064b968acae67d8a49d66dffa626da4e1780346296666ac93ea4851c7"} Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.974606 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.976023 4766 patch_prober.go:28] interesting pod/controller-manager-695fc9d698-pt5m5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.976067 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" podUID="c85ffb2a-937f-458b-b05c-186633a2265b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.976470 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" event={"ID":"c9b67498-9d67-4643-b30d-14fe79b2ced8","Type":"ContainerStarted","Data":"9e7f26bf6dfc902bf148df51a32cf9b0e2086dddcfddc52e356366fecce3a69c"} Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.976497 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" event={"ID":"c9b67498-9d67-4643-b30d-14fe79b2ced8","Type":"ContainerStarted","Data":"54a213df9ef8318addaa5b6a2955c67678e79cc92cd3eec084c76af9596886ea"} Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.976996 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.978940 4766 patch_prober.go:28] interesting pod/route-controller-manager-8945d6fbc-pkg7q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.978981 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" podUID="c9b67498-9d67-4643-b30d-14fe79b2ced8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.982975 4766 generic.go:334] "Generic (PLEG): container finished" podID="29393907-e416-489a-a2dc-92c75c0284bc" containerID="a0e3a172c4a5a365bb6d1d7ad57a35d73a3bc9eee9bdb30ed17b28188086138e" exitCode=0 Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.983653 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtw8c" event={"ID":"29393907-e416-489a-a2dc-92c75c0284bc","Type":"ContainerDied","Data":"a0e3a172c4a5a365bb6d1d7ad57a35d73a3bc9eee9bdb30ed17b28188086138e"} Dec 09 03:18:15 crc kubenswrapper[4766]: I1209 03:18:15.983684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtw8c" 
event={"ID":"29393907-e416-489a-a2dc-92c75c0284bc","Type":"ContainerStarted","Data":"442770a7a89a738f28cd82445385befbc675dc6d7043ca14fb93d536f8b01617"} Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.011709 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" podStartSLOduration=3.011679646 podStartE2EDuration="3.011679646s" podCreationTimestamp="2025-12-09 03:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:18:15.998383474 +0000 UTC m=+377.707688910" watchObservedRunningTime="2025-12-09 03:18:16.011679646 +0000 UTC m=+377.720985062" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.014202 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s4fw9"] Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.015348 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.024945 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.031600 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4fw9"] Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.063769 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" podStartSLOduration=3.063751773 podStartE2EDuration="3.063751773s" podCreationTimestamp="2025-12-09 03:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:18:16.046129727 +0000 UTC m=+377.755435173" watchObservedRunningTime="2025-12-09 03:18:16.063751773 +0000 UTC m=+377.773057189" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.165926 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46k2g\" (UniqueName: \"kubernetes.io/projected/81c2824b-d083-4153-ad95-e8629b600175-kube-api-access-46k2g\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.166056 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c2824b-d083-4153-ad95-e8629b600175-catalog-content\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.166107 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c2824b-d083-4153-ad95-e8629b600175-utilities\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.266957 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c2824b-d083-4153-ad95-e8629b600175-utilities\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.267026 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46k2g\" (UniqueName: \"kubernetes.io/projected/81c2824b-d083-4153-ad95-e8629b600175-kube-api-access-46k2g\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.267072 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c2824b-d083-4153-ad95-e8629b600175-catalog-content\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.267528 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c2824b-d083-4153-ad95-e8629b600175-catalog-content\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.267797 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c2824b-d083-4153-ad95-e8629b600175-utilities\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.293192 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46k2g\" (UniqueName: \"kubernetes.io/projected/81c2824b-d083-4153-ad95-e8629b600175-kube-api-access-46k2g\") pod \"redhat-marketplace-s4fw9\" (UID: \"81c2824b-d083-4153-ad95-e8629b600175\") " pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.336416 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.564642 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4fw9"] Dec 09 03:18:16 crc kubenswrapper[4766]: W1209 03:18:16.573564 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c2824b_d083_4153_ad95_e8629b600175.slice/crio-87005f8e8f88da3d9671101466dc83ab22f75a333231adb81e8ad4fc68d2b00c WatchSource:0}: Error finding container 87005f8e8f88da3d9671101466dc83ab22f75a333231adb81e8ad4fc68d2b00c: Status 404 returned error can't find the container with id 87005f8e8f88da3d9671101466dc83ab22f75a333231adb81e8ad4fc68d2b00c Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.849258 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a7d2c4-1941-48d4-9285-210f025702f6" path="/var/lib/kubelet/pods/11a7d2c4-1941-48d4-9285-210f025702f6/volumes" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.850993 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871504f6-fe8a-4ac1-a97d-bb4125c99736" 
path="/var/lib/kubelet/pods/871504f6-fe8a-4ac1-a97d-bb4125c99736/volumes" Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.992887 4766 generic.go:334] "Generic (PLEG): container finished" podID="81c2824b-d083-4153-ad95-e8629b600175" containerID="8a8477969debc388017978cc29e7568a141d7ef702b272fd94aee7933726c21a" exitCode=0 Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.992934 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4fw9" event={"ID":"81c2824b-d083-4153-ad95-e8629b600175","Type":"ContainerDied","Data":"8a8477969debc388017978cc29e7568a141d7ef702b272fd94aee7933726c21a"} Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.993034 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4fw9" event={"ID":"81c2824b-d083-4153-ad95-e8629b600175","Type":"ContainerStarted","Data":"87005f8e8f88da3d9671101466dc83ab22f75a333231adb81e8ad4fc68d2b00c"} Dec 09 03:18:16 crc kubenswrapper[4766]: I1209 03:18:16.999380 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-695fc9d698-pt5m5" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.006480 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8945d6fbc-pkg7q" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.042981 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2gxpb"] Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.050514 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.053751 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.066358 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gxpb"] Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.182528 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7pmd\" (UniqueName: \"kubernetes.io/projected/9fe686c1-516d-43be-8a61-776ff1f64cd1-kube-api-access-p7pmd\") pod \"redhat-operators-2gxpb\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.182853 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-utilities\") pod \"redhat-operators-2gxpb\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.182902 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-catalog-content\") pod \"redhat-operators-2gxpb\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.284650 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-catalog-content\") pod \"redhat-operators-2gxpb\" (UID: 
\"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.284741 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7pmd\" (UniqueName: \"kubernetes.io/projected/9fe686c1-516d-43be-8a61-776ff1f64cd1-kube-api-access-p7pmd\") pod \"redhat-operators-2gxpb\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.285394 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-catalog-content\") pod \"redhat-operators-2gxpb\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.285477 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-utilities\") pod \"redhat-operators-2gxpb\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.285750 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-utilities\") pod \"redhat-operators-2gxpb\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.307387 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7pmd\" (UniqueName: \"kubernetes.io/projected/9fe686c1-516d-43be-8a61-776ff1f64cd1-kube-api-access-p7pmd\") pod \"redhat-operators-2gxpb\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " 
pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.402025 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:17 crc kubenswrapper[4766]: I1209 03:18:17.804389 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gxpb"] Dec 09 03:18:17 crc kubenswrapper[4766]: W1209 03:18:17.809301 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe686c1_516d_43be_8a61_776ff1f64cd1.slice/crio-c4b4d013761a9f789468d9a048656640d1b995496ec2b4d81c328121d8fbc4b1 WatchSource:0}: Error finding container c4b4d013761a9f789468d9a048656640d1b995496ec2b4d81c328121d8fbc4b1: Status 404 returned error can't find the container with id c4b4d013761a9f789468d9a048656640d1b995496ec2b4d81c328121d8fbc4b1 Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.001105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4fw9" event={"ID":"81c2824b-d083-4153-ad95-e8629b600175","Type":"ContainerStarted","Data":"0900d14e84ee22af598d79ad92b2c98d53f82861a28e6bc3f998e541209e06d1"} Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.002507 4766 generic.go:334] "Generic (PLEG): container finished" podID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerID="f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6" exitCode=0 Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.002558 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gxpb" event={"ID":"9fe686c1-516d-43be-8a61-776ff1f64cd1","Type":"ContainerDied","Data":"f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6"} Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.002584 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2gxpb" event={"ID":"9fe686c1-516d-43be-8a61-776ff1f64cd1","Type":"ContainerStarted","Data":"c4b4d013761a9f789468d9a048656640d1b995496ec2b4d81c328121d8fbc4b1"} Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.421078 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pt6rh"] Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.422808 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.427549 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.432161 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pt6rh"] Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.500539 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-utilities\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.500585 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-catalog-content\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.500608 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rdg\" (UniqueName: 
\"kubernetes.io/projected/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-kube-api-access-f2rdg\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.602308 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-utilities\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.602380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-catalog-content\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.602422 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rdg\" (UniqueName: \"kubernetes.io/projected/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-kube-api-access-f2rdg\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.602826 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-utilities\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.602865 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-catalog-content\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.625171 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2rdg\" (UniqueName: \"kubernetes.io/projected/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-kube-api-access-f2rdg\") pod \"community-operators-pt6rh\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:18 crc kubenswrapper[4766]: I1209 03:18:18.747298 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:19 crc kubenswrapper[4766]: I1209 03:18:19.014068 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gxpb" event={"ID":"9fe686c1-516d-43be-8a61-776ff1f64cd1","Type":"ContainerStarted","Data":"46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74"} Dec 09 03:18:19 crc kubenswrapper[4766]: I1209 03:18:19.021147 4766 generic.go:334] "Generic (PLEG): container finished" podID="81c2824b-d083-4153-ad95-e8629b600175" containerID="0900d14e84ee22af598d79ad92b2c98d53f82861a28e6bc3f998e541209e06d1" exitCode=0 Dec 09 03:18:19 crc kubenswrapper[4766]: I1209 03:18:19.021240 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4fw9" event={"ID":"81c2824b-d083-4153-ad95-e8629b600175","Type":"ContainerDied","Data":"0900d14e84ee22af598d79ad92b2c98d53f82861a28e6bc3f998e541209e06d1"} Dec 09 03:18:19 crc kubenswrapper[4766]: I1209 03:18:19.170347 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pt6rh"] Dec 09 03:18:19 crc kubenswrapper[4766]: W1209 03:18:19.178174 4766 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aadd3b6_2c6f_414f_ab7f_bbd82c568689.slice/crio-d43a8e29a16cc718f389ee21f5d0068b5f6576e1112bf9c0d4fd480000dd2d4c WatchSource:0}: Error finding container d43a8e29a16cc718f389ee21f5d0068b5f6576e1112bf9c0d4fd480000dd2d4c: Status 404 returned error can't find the container with id d43a8e29a16cc718f389ee21f5d0068b5f6576e1112bf9c0d4fd480000dd2d4c Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.029988 4766 generic.go:334] "Generic (PLEG): container finished" podID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerID="a3adaba59234c56d59bd1503ffaee5a99a9c6fbfd741007f9f95851f40862df5" exitCode=0 Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.030083 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pt6rh" event={"ID":"8aadd3b6-2c6f-414f-ab7f-bbd82c568689","Type":"ContainerDied","Data":"a3adaba59234c56d59bd1503ffaee5a99a9c6fbfd741007f9f95851f40862df5"} Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.032708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pt6rh" event={"ID":"8aadd3b6-2c6f-414f-ab7f-bbd82c568689","Type":"ContainerStarted","Data":"d43a8e29a16cc718f389ee21f5d0068b5f6576e1112bf9c0d4fd480000dd2d4c"} Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.037053 4766 generic.go:334] "Generic (PLEG): container finished" podID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerID="46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74" exitCode=0 Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.037105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gxpb" event={"ID":"9fe686c1-516d-43be-8a61-776ff1f64cd1","Type":"ContainerDied","Data":"46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74"} Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.040502 4766 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4fw9" event={"ID":"81c2824b-d083-4153-ad95-e8629b600175","Type":"ContainerStarted","Data":"0a80ee93d08032cc29cb88193baf06a474fdc09293d28935e8cb9607daa6a825"} Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.092409 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s4fw9" podStartSLOduration=2.647985693 podStartE2EDuration="5.092381675s" podCreationTimestamp="2025-12-09 03:18:15 +0000 UTC" firstStartedPulling="2025-12-09 03:18:16.995047198 +0000 UTC m=+378.704352644" lastFinishedPulling="2025-12-09 03:18:19.4394432 +0000 UTC m=+381.148748626" observedRunningTime="2025-12-09 03:18:20.087525217 +0000 UTC m=+381.796830653" watchObservedRunningTime="2025-12-09 03:18:20.092381675 +0000 UTC m=+381.801687101" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.694931 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-885kq"] Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.696358 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.712996 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-885kq"] Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.738969 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-registry-certificates\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.739014 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.739090 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.739125 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-trusted-ca\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.739150 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-registry-tls\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.739175 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.739193 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k6fq\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-kube-api-access-5k6fq\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.739227 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-bound-sa-token\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.799815 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.840149 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-trusted-ca\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.840220 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-registry-tls\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.840249 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.840268 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k6fq\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-kube-api-access-5k6fq\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.840287 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-bound-sa-token\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.840310 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-registry-certificates\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.840327 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.841146 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.841611 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-trusted-ca\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc 
kubenswrapper[4766]: I1209 03:18:20.842560 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-registry-certificates\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.851513 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.851926 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-registry-tls\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.869968 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k6fq\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-kube-api-access-5k6fq\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:20 crc kubenswrapper[4766]: I1209 03:18:20.887743 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fb6673c-0caf-4a1a-87a0-9353799d6c0b-bound-sa-token\") pod \"image-registry-66df7c8f76-885kq\" (UID: \"2fb6673c-0caf-4a1a-87a0-9353799d6c0b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:21 crc kubenswrapper[4766]: I1209 03:18:21.016618 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:21 crc kubenswrapper[4766]: I1209 03:18:21.049499 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtw8c" event={"ID":"29393907-e416-489a-a2dc-92c75c0284bc","Type":"ContainerStarted","Data":"94dc6cd5e89548729ec64cf815e3a0890f89af2d048150c079f259fd5331f25d"} Dec 09 03:18:21 crc kubenswrapper[4766]: I1209 03:18:21.434666 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-885kq"] Dec 09 03:18:22 crc kubenswrapper[4766]: I1209 03:18:22.059656 4766 generic.go:334] "Generic (PLEG): container finished" podID="29393907-e416-489a-a2dc-92c75c0284bc" containerID="94dc6cd5e89548729ec64cf815e3a0890f89af2d048150c079f259fd5331f25d" exitCode=0 Dec 09 03:18:22 crc kubenswrapper[4766]: I1209 03:18:22.059719 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtw8c" event={"ID":"29393907-e416-489a-a2dc-92c75c0284bc","Type":"ContainerDied","Data":"94dc6cd5e89548729ec64cf815e3a0890f89af2d048150c079f259fd5331f25d"} Dec 09 03:18:22 crc kubenswrapper[4766]: I1209 03:18:22.061577 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-885kq" event={"ID":"2fb6673c-0caf-4a1a-87a0-9353799d6c0b","Type":"ContainerStarted","Data":"5320dc00d809a3168f526a2cc5d2b92542d0aaf8ca20fbbab55ddaf871e77c46"} Dec 09 03:18:22 crc kubenswrapper[4766]: I1209 03:18:22.063902 4766 generic.go:334] "Generic (PLEG): container finished" podID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerID="8b65800fc7209b610b22493e9a516d034b0fbcb7eeb84e882ee346fc3a20ed51" exitCode=0 Dec 09 03:18:22 crc kubenswrapper[4766]: I1209 03:18:22.063935 
4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pt6rh" event={"ID":"8aadd3b6-2c6f-414f-ab7f-bbd82c568689","Type":"ContainerDied","Data":"8b65800fc7209b610b22493e9a516d034b0fbcb7eeb84e882ee346fc3a20ed51"} Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.075046 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtw8c" event={"ID":"29393907-e416-489a-a2dc-92c75c0284bc","Type":"ContainerStarted","Data":"867fa31cf6551b176bf9cc80b422b475f6916b09bd067f96cb1b571d54fbfaae"} Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.076775 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-885kq" event={"ID":"2fb6673c-0caf-4a1a-87a0-9353799d6c0b","Type":"ContainerStarted","Data":"c5bb08141282952090263dfa6cc2efe9801656ed4525d5a1cce6243411854adc"} Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.076882 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.078471 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pt6rh" event={"ID":"8aadd3b6-2c6f-414f-ab7f-bbd82c568689","Type":"ContainerStarted","Data":"5f99dbbc13ee0489976e6eee73f09e7f039e218f91aee1be5a7940a48887d3fa"} Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.082010 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gxpb" event={"ID":"9fe686c1-516d-43be-8a61-776ff1f64cd1","Type":"ContainerStarted","Data":"3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144"} Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.097760 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dtw8c" podStartSLOduration=2.466566774 
podStartE2EDuration="10.097742322s" podCreationTimestamp="2025-12-09 03:18:14 +0000 UTC" firstStartedPulling="2025-12-09 03:18:15.984288102 +0000 UTC m=+377.693593528" lastFinishedPulling="2025-12-09 03:18:23.61546365 +0000 UTC m=+385.324769076" observedRunningTime="2025-12-09 03:18:24.094879466 +0000 UTC m=+385.804184902" watchObservedRunningTime="2025-12-09 03:18:24.097742322 +0000 UTC m=+385.807047748" Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.120156 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-885kq" podStartSLOduration=4.120137554 podStartE2EDuration="4.120137554s" podCreationTimestamp="2025-12-09 03:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:18:24.115932833 +0000 UTC m=+385.825238259" watchObservedRunningTime="2025-12-09 03:18:24.120137554 +0000 UTC m=+385.829442980" Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.138121 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pt6rh" podStartSLOduration=2.458386353 podStartE2EDuration="6.13810499s" podCreationTimestamp="2025-12-09 03:18:18 +0000 UTC" firstStartedPulling="2025-12-09 03:18:20.032514872 +0000 UTC m=+381.741820298" lastFinishedPulling="2025-12-09 03:18:23.712233509 +0000 UTC m=+385.421538935" observedRunningTime="2025-12-09 03:18:24.136570069 +0000 UTC m=+385.845875495" watchObservedRunningTime="2025-12-09 03:18:24.13810499 +0000 UTC m=+385.847410416" Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.170376 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2gxpb" podStartSLOduration=3.672043652 podStartE2EDuration="7.170354992s" podCreationTimestamp="2025-12-09 03:18:17 +0000 UTC" firstStartedPulling="2025-12-09 03:18:18.003473532 +0000 UTC 
m=+379.712778958" lastFinishedPulling="2025-12-09 03:18:21.501784872 +0000 UTC m=+383.211090298" observedRunningTime="2025-12-09 03:18:24.168607736 +0000 UTC m=+385.877913172" watchObservedRunningTime="2025-12-09 03:18:24.170354992 +0000 UTC m=+385.879660418" Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.947722 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:24 crc kubenswrapper[4766]: I1209 03:18:24.948097 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:25 crc kubenswrapper[4766]: I1209 03:18:25.986640 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dtw8c" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="registry-server" probeResult="failure" output=< Dec 09 03:18:25 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 03:18:25 crc kubenswrapper[4766]: > Dec 09 03:18:26 crc kubenswrapper[4766]: I1209 03:18:26.337308 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:26 crc kubenswrapper[4766]: I1209 03:18:26.337726 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:26 crc kubenswrapper[4766]: I1209 03:18:26.380791 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:27 crc kubenswrapper[4766]: I1209 03:18:27.144600 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s4fw9" Dec 09 03:18:27 crc kubenswrapper[4766]: I1209 03:18:27.403482 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:27 crc kubenswrapper[4766]: I1209 03:18:27.403533 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:28 crc kubenswrapper[4766]: I1209 03:18:28.459910 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2gxpb" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="registry-server" probeResult="failure" output=< Dec 09 03:18:28 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 03:18:28 crc kubenswrapper[4766]: > Dec 09 03:18:28 crc kubenswrapper[4766]: I1209 03:18:28.747795 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:28 crc kubenswrapper[4766]: I1209 03:18:28.747878 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:28 crc kubenswrapper[4766]: I1209 03:18:28.796950 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:29 crc kubenswrapper[4766]: I1209 03:18:29.158355 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pt6rh" Dec 09 03:18:34 crc kubenswrapper[4766]: I1209 03:18:34.993854 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:35 crc kubenswrapper[4766]: I1209 03:18:35.036534 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 03:18:37 crc kubenswrapper[4766]: I1209 03:18:37.316675 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:18:37 crc kubenswrapper[4766]: I1209 03:18:37.316792 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:18:37 crc kubenswrapper[4766]: I1209 03:18:37.473772 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:37 crc kubenswrapper[4766]: I1209 03:18:37.519684 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 03:18:41 crc kubenswrapper[4766]: I1209 03:18:41.024996 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-885kq" Dec 09 03:18:41 crc kubenswrapper[4766]: I1209 03:18:41.071934 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gljdx"] Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.121667 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" podUID="59e81ed6-e457-4e59-a25e-b40dceea3cfd" containerName="registry" containerID="cri-o://2d97322ab85d0f1515da078d74c850d07cf70be43283fee04800e7a91b430a82" gracePeriod=30 Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.350697 4766 generic.go:334] "Generic (PLEG): container finished" podID="59e81ed6-e457-4e59-a25e-b40dceea3cfd" containerID="2d97322ab85d0f1515da078d74c850d07cf70be43283fee04800e7a91b430a82" exitCode=0 Dec 09 03:19:06 crc 
kubenswrapper[4766]: I1209 03:19:06.350746 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" event={"ID":"59e81ed6-e457-4e59-a25e-b40dceea3cfd","Type":"ContainerDied","Data":"2d97322ab85d0f1515da078d74c850d07cf70be43283fee04800e7a91b430a82"} Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.531443 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.683890 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-bound-sa-token\") pod \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.683971 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e81ed6-e457-4e59-a25e-b40dceea3cfd-ca-trust-extracted\") pod \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.684317 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcmg4\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-kube-api-access-dcmg4\") pod \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.684367 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-trusted-ca\") pod \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 
03:19:06.684416 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e81ed6-e457-4e59-a25e-b40dceea3cfd-installation-pull-secrets\") pod \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.685128 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "59e81ed6-e457-4e59-a25e-b40dceea3cfd" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.685846 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "59e81ed6-e457-4e59-a25e-b40dceea3cfd" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.685887 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-certificates\") pod \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.686042 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.686425 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-tls\") pod \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\" (UID: \"59e81ed6-e457-4e59-a25e-b40dceea3cfd\") " Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.688349 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.688879 4766 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.689905 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e81ed6-e457-4e59-a25e-b40dceea3cfd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod 
"59e81ed6-e457-4e59-a25e-b40dceea3cfd" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.689963 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-kube-api-access-dcmg4" (OuterVolumeSpecName: "kube-api-access-dcmg4") pod "59e81ed6-e457-4e59-a25e-b40dceea3cfd" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd"). InnerVolumeSpecName "kube-api-access-dcmg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.690150 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "59e81ed6-e457-4e59-a25e-b40dceea3cfd" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.691855 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "59e81ed6-e457-4e59-a25e-b40dceea3cfd" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.695969 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "59e81ed6-e457-4e59-a25e-b40dceea3cfd" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.704095 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e81ed6-e457-4e59-a25e-b40dceea3cfd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "59e81ed6-e457-4e59-a25e-b40dceea3cfd" (UID: "59e81ed6-e457-4e59-a25e-b40dceea3cfd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.789577 4766 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.789605 4766 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.789615 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcmg4\" (UniqueName: \"kubernetes.io/projected/59e81ed6-e457-4e59-a25e-b40dceea3cfd-kube-api-access-dcmg4\") on node \"crc\" DevicePath \"\"" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.789626 4766 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59e81ed6-e457-4e59-a25e-b40dceea3cfd-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 09 03:19:06 crc kubenswrapper[4766]: I1209 03:19:06.789634 4766 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59e81ed6-e457-4e59-a25e-b40dceea3cfd-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.316916 4766 patch_prober.go:28] interesting 
pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.317025 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.317108 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.318169 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"529f1759cbae2ffe7af74a6b51baacdf0d487f623ff2e9c4f8dd43b5b64d87b4"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.318397 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://529f1759cbae2ffe7af74a6b51baacdf0d487f623ff2e9c4f8dd43b5b64d87b4" gracePeriod=600 Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.358072 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" 
event={"ID":"59e81ed6-e457-4e59-a25e-b40dceea3cfd","Type":"ContainerDied","Data":"4ce2e10d3686ea19c2f372064a9ba48d36cbdfa5a5b6ae120b2874bf56805f76"} Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.358143 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gljdx" Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.358163 4766 scope.go:117] "RemoveContainer" containerID="2d97322ab85d0f1515da078d74c850d07cf70be43283fee04800e7a91b430a82" Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.385104 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gljdx"] Dec 09 03:19:07 crc kubenswrapper[4766]: I1209 03:19:07.392978 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gljdx"] Dec 09 03:19:08 crc kubenswrapper[4766]: I1209 03:19:08.370837 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="529f1759cbae2ffe7af74a6b51baacdf0d487f623ff2e9c4f8dd43b5b64d87b4" exitCode=0 Dec 09 03:19:08 crc kubenswrapper[4766]: I1209 03:19:08.370932 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"529f1759cbae2ffe7af74a6b51baacdf0d487f623ff2e9c4f8dd43b5b64d87b4"} Dec 09 03:19:08 crc kubenswrapper[4766]: I1209 03:19:08.371235 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"a3cdbbe8c466b323124372f294f955032904065097d8e70ff6e691e2b8b38bd2"} Dec 09 03:19:08 crc kubenswrapper[4766]: I1209 03:19:08.371261 4766 scope.go:117] "RemoveContainer" 
containerID="e63ec86ff1ecafd7feedef3d27cf91e0289fb1c5ed2b7d96ada69635bac6907e" Dec 09 03:19:08 crc kubenswrapper[4766]: I1209 03:19:08.845304 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e81ed6-e457-4e59-a25e-b40dceea3cfd" path="/var/lib/kubelet/pods/59e81ed6-e457-4e59-a25e-b40dceea3cfd/volumes" Dec 09 03:20:59 crc kubenswrapper[4766]: I1209 03:20:59.126301 4766 scope.go:117] "RemoveContainer" containerID="8f9fe9800961b2f53dca719e32f940880e60c12070d3697c1f0f141bdc5dd5a0" Dec 09 03:21:07 crc kubenswrapper[4766]: I1209 03:21:07.316858 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:21:07 crc kubenswrapper[4766]: I1209 03:21:07.317452 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:21:37 crc kubenswrapper[4766]: I1209 03:21:37.317183 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:21:37 crc kubenswrapper[4766]: I1209 03:21:37.317870 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 09 03:21:59 crc kubenswrapper[4766]: I1209 03:21:59.186435 4766 scope.go:117] "RemoveContainer" containerID="ff4b3f1f1aa94f8b3509d2686b104a29ecf6de4ac713a3cac4ac1fe00d7bfc66" Dec 09 03:21:59 crc kubenswrapper[4766]: I1209 03:21:59.205968 4766 scope.go:117] "RemoveContainer" containerID="5a4894beda7c306eaac579b1dfc0ebcb4741a95420863cd479edd89d8d6fd2a2" Dec 09 03:22:07 crc kubenswrapper[4766]: I1209 03:22:07.317123 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:22:07 crc kubenswrapper[4766]: I1209 03:22:07.318254 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:22:07 crc kubenswrapper[4766]: I1209 03:22:07.318348 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:22:07 crc kubenswrapper[4766]: I1209 03:22:07.319327 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3cdbbe8c466b323124372f294f955032904065097d8e70ff6e691e2b8b38bd2"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:22:07 crc kubenswrapper[4766]: I1209 03:22:07.319478 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://a3cdbbe8c466b323124372f294f955032904065097d8e70ff6e691e2b8b38bd2" gracePeriod=600 Dec 09 03:22:07 crc kubenswrapper[4766]: I1209 03:22:07.570000 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="a3cdbbe8c466b323124372f294f955032904065097d8e70ff6e691e2b8b38bd2" exitCode=0 Dec 09 03:22:07 crc kubenswrapper[4766]: I1209 03:22:07.570252 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"a3cdbbe8c466b323124372f294f955032904065097d8e70ff6e691e2b8b38bd2"} Dec 09 03:22:07 crc kubenswrapper[4766]: I1209 03:22:07.570511 4766 scope.go:117] "RemoveContainer" containerID="529f1759cbae2ffe7af74a6b51baacdf0d487f623ff2e9c4f8dd43b5b64d87b4" Dec 09 03:22:08 crc kubenswrapper[4766]: I1209 03:22:08.580612 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"d129c895ab3a94eff9c52289bd9c855b60f720ebf245b70e4c33b5f5b16f3734"} Dec 09 03:24:07 crc kubenswrapper[4766]: I1209 03:24:07.316845 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:24:07 crc kubenswrapper[4766]: I1209 03:24:07.317464 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 09 03:24:37 crc kubenswrapper[4766]: I1209 03:24:37.316812 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:24:37 crc kubenswrapper[4766]: I1209 03:24:37.317517 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:24:43 crc kubenswrapper[4766]: I1209 03:24:43.283812 4766 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 03:25:07.316882 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 03:25:07.317516 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 03:25:07.317564 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 
03:25:07.318166 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d129c895ab3a94eff9c52289bd9c855b60f720ebf245b70e4c33b5f5b16f3734"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 03:25:07.318253 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://d129c895ab3a94eff9c52289bd9c855b60f720ebf245b70e4c33b5f5b16f3734" gracePeriod=600 Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 03:25:07.933453 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="d129c895ab3a94eff9c52289bd9c855b60f720ebf245b70e4c33b5f5b16f3734" exitCode=0 Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 03:25:07.933628 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"d129c895ab3a94eff9c52289bd9c855b60f720ebf245b70e4c33b5f5b16f3734"} Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 03:25:07.934008 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"2f05dd63b19e38608a1f279fa1bc992ee750c24d1fb42681cf4619e9e86acf2d"} Dec 09 03:25:07 crc kubenswrapper[4766]: I1209 03:25:07.934050 4766 scope.go:117] "RemoveContainer" containerID="a3cdbbe8c466b323124372f294f955032904065097d8e70ff6e691e2b8b38bd2" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.548364 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-ll6g8"] Dec 09 03:25:14 crc kubenswrapper[4766]: E1209 03:25:14.549602 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e81ed6-e457-4e59-a25e-b40dceea3cfd" containerName="registry" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.549617 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e81ed6-e457-4e59-a25e-b40dceea3cfd" containerName="registry" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.549728 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e81ed6-e457-4e59-a25e-b40dceea3cfd" containerName="registry" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.556013 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.557689 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ll6g8"] Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.663332 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjxv\" (UniqueName: \"kubernetes.io/projected/5efa157d-584a-4209-a867-2fc11da31f76-kube-api-access-5jjxv\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.663379 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-catalog-content\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.663427 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-utilities\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.764858 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-utilities\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.764953 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjxv\" (UniqueName: \"kubernetes.io/projected/5efa157d-584a-4209-a867-2fc11da31f76-kube-api-access-5jjxv\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.764987 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-catalog-content\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.765499 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-utilities\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.765532 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-catalog-content\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.797080 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjxv\" (UniqueName: \"kubernetes.io/projected/5efa157d-584a-4209-a867-2fc11da31f76-kube-api-access-5jjxv\") pod \"redhat-operators-ll6g8\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:14 crc kubenswrapper[4766]: I1209 03:25:14.891519 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:15 crc kubenswrapper[4766]: I1209 03:25:15.308259 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ll6g8"] Dec 09 03:25:15 crc kubenswrapper[4766]: I1209 03:25:15.985754 4766 generic.go:334] "Generic (PLEG): container finished" podID="5efa157d-584a-4209-a867-2fc11da31f76" containerID="b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675" exitCode=0 Dec 09 03:25:15 crc kubenswrapper[4766]: I1209 03:25:15.985799 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll6g8" event={"ID":"5efa157d-584a-4209-a867-2fc11da31f76","Type":"ContainerDied","Data":"b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675"} Dec 09 03:25:15 crc kubenswrapper[4766]: I1209 03:25:15.986084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll6g8" event={"ID":"5efa157d-584a-4209-a867-2fc11da31f76","Type":"ContainerStarted","Data":"ccaf1a203aa23b2325ec5a3275b979f4f0a6f826c909fc4031efcb82024f4de8"} Dec 09 03:25:15 crc kubenswrapper[4766]: I1209 03:25:15.987001 4766 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 09 03:25:16 crc kubenswrapper[4766]: I1209 03:25:16.998823 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll6g8" event={"ID":"5efa157d-584a-4209-a867-2fc11da31f76","Type":"ContainerStarted","Data":"665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28"} Dec 09 03:25:18 crc kubenswrapper[4766]: I1209 03:25:18.010387 4766 generic.go:334] "Generic (PLEG): container finished" podID="5efa157d-584a-4209-a867-2fc11da31f76" containerID="665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28" exitCode=0 Dec 09 03:25:18 crc kubenswrapper[4766]: I1209 03:25:18.010514 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll6g8" event={"ID":"5efa157d-584a-4209-a867-2fc11da31f76","Type":"ContainerDied","Data":"665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28"} Dec 09 03:25:19 crc kubenswrapper[4766]: I1209 03:25:19.021079 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll6g8" event={"ID":"5efa157d-584a-4209-a867-2fc11da31f76","Type":"ContainerStarted","Data":"792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675"} Dec 09 03:25:19 crc kubenswrapper[4766]: I1209 03:25:19.053038 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ll6g8" podStartSLOduration=2.602077174 podStartE2EDuration="5.053015584s" podCreationTimestamp="2025-12-09 03:25:14 +0000 UTC" firstStartedPulling="2025-12-09 03:25:15.986822915 +0000 UTC m=+797.696128341" lastFinishedPulling="2025-12-09 03:25:18.437761325 +0000 UTC m=+800.147066751" observedRunningTime="2025-12-09 03:25:19.04834699 +0000 UTC m=+800.757652456" watchObservedRunningTime="2025-12-09 03:25:19.053015584 +0000 UTC m=+800.762321040" Dec 09 03:25:24 crc kubenswrapper[4766]: I1209 03:25:24.891821 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:24 crc kubenswrapper[4766]: I1209 03:25:24.892698 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:25 crc kubenswrapper[4766]: I1209 03:25:25.959080 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ll6g8" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="registry-server" probeResult="failure" output=< Dec 09 03:25:25 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 03:25:25 crc kubenswrapper[4766]: > Dec 09 03:25:34 crc kubenswrapper[4766]: I1209 03:25:34.978151 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:35 crc kubenswrapper[4766]: I1209 03:25:35.029531 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:35 crc kubenswrapper[4766]: I1209 03:25:35.218907 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ll6g8"] Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.129179 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ll6g8" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="registry-server" containerID="cri-o://792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675" gracePeriod=2 Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.471743 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.556922 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjxv\" (UniqueName: \"kubernetes.io/projected/5efa157d-584a-4209-a867-2fc11da31f76-kube-api-access-5jjxv\") pod \"5efa157d-584a-4209-a867-2fc11da31f76\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.557022 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-utilities\") pod \"5efa157d-584a-4209-a867-2fc11da31f76\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.557193 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-catalog-content\") pod \"5efa157d-584a-4209-a867-2fc11da31f76\" (UID: \"5efa157d-584a-4209-a867-2fc11da31f76\") " Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.558330 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-utilities" (OuterVolumeSpecName: "utilities") pod "5efa157d-584a-4209-a867-2fc11da31f76" (UID: "5efa157d-584a-4209-a867-2fc11da31f76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.564999 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5efa157d-584a-4209-a867-2fc11da31f76-kube-api-access-5jjxv" (OuterVolumeSpecName: "kube-api-access-5jjxv") pod "5efa157d-584a-4209-a867-2fc11da31f76" (UID: "5efa157d-584a-4209-a867-2fc11da31f76"). InnerVolumeSpecName "kube-api-access-5jjxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.658922 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jjxv\" (UniqueName: \"kubernetes.io/projected/5efa157d-584a-4209-a867-2fc11da31f76-kube-api-access-5jjxv\") on node \"crc\" DevicePath \"\"" Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.658961 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.736941 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5efa157d-584a-4209-a867-2fc11da31f76" (UID: "5efa157d-584a-4209-a867-2fc11da31f76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:25:36 crc kubenswrapper[4766]: I1209 03:25:36.760508 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5efa157d-584a-4209-a867-2fc11da31f76-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.139666 4766 generic.go:334] "Generic (PLEG): container finished" podID="5efa157d-584a-4209-a867-2fc11da31f76" containerID="792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675" exitCode=0 Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.139769 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll6g8" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.139796 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll6g8" event={"ID":"5efa157d-584a-4209-a867-2fc11da31f76","Type":"ContainerDied","Data":"792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675"} Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.140651 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll6g8" event={"ID":"5efa157d-584a-4209-a867-2fc11da31f76","Type":"ContainerDied","Data":"ccaf1a203aa23b2325ec5a3275b979f4f0a6f826c909fc4031efcb82024f4de8"} Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.140697 4766 scope.go:117] "RemoveContainer" containerID="792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.166525 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ll6g8"] Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.173282 4766 scope.go:117] "RemoveContainer" containerID="665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.174983 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ll6g8"] Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.201543 4766 scope.go:117] "RemoveContainer" containerID="b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.222068 4766 scope.go:117] "RemoveContainer" containerID="792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675" Dec 09 03:25:37 crc kubenswrapper[4766]: E1209 03:25:37.222578 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675\": container with ID starting with 792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675 not found: ID does not exist" containerID="792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.222618 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675"} err="failed to get container status \"792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675\": rpc error: code = NotFound desc = could not find container \"792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675\": container with ID starting with 792ecbe65cf3025dc1a50ba8c0e8287ea94dd983f72e4c2d81c3026236c4a675 not found: ID does not exist" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.222649 4766 scope.go:117] "RemoveContainer" containerID="665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28" Dec 09 03:25:37 crc kubenswrapper[4766]: E1209 03:25:37.223196 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28\": container with ID starting with 665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28 not found: ID does not exist" containerID="665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.223252 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28"} err="failed to get container status \"665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28\": rpc error: code = NotFound desc = could not find container \"665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28\": container with ID 
starting with 665836bfcfa8d18c12f078bc0d9ee37de3512e9a1472d3c8af063058ab341c28 not found: ID does not exist" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.223278 4766 scope.go:117] "RemoveContainer" containerID="b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675" Dec 09 03:25:37 crc kubenswrapper[4766]: E1209 03:25:37.224145 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675\": container with ID starting with b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675 not found: ID does not exist" containerID="b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675" Dec 09 03:25:37 crc kubenswrapper[4766]: I1209 03:25:37.224262 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675"} err="failed to get container status \"b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675\": rpc error: code = NotFound desc = could not find container \"b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675\": container with ID starting with b2f8f6b282328c86072c1a723971b92946422d37e4ce2d64681813aa68b5f675 not found: ID does not exist" Dec 09 03:25:38 crc kubenswrapper[4766]: I1209 03:25:38.848315 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5efa157d-584a-4209-a867-2fc11da31f76" path="/var/lib/kubelet/pods/5efa157d-584a-4209-a867-2fc11da31f76/volumes" Dec 09 03:26:21 crc kubenswrapper[4766]: I1209 03:26:21.895384 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nncw8"] Dec 09 03:26:21 crc kubenswrapper[4766]: E1209 03:26:21.896504 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="extract-utilities" Dec 09 03:26:21 crc 
kubenswrapper[4766]: I1209 03:26:21.896527 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="extract-utilities" Dec 09 03:26:21 crc kubenswrapper[4766]: E1209 03:26:21.896558 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="registry-server" Dec 09 03:26:21 crc kubenswrapper[4766]: I1209 03:26:21.896571 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="registry-server" Dec 09 03:26:21 crc kubenswrapper[4766]: E1209 03:26:21.896598 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="extract-content" Dec 09 03:26:21 crc kubenswrapper[4766]: I1209 03:26:21.896611 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="extract-content" Dec 09 03:26:21 crc kubenswrapper[4766]: I1209 03:26:21.896785 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5efa157d-584a-4209-a867-2fc11da31f76" containerName="registry-server" Dec 09 03:26:21 crc kubenswrapper[4766]: I1209 03:26:21.898099 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:21 crc kubenswrapper[4766]: I1209 03:26:21.912612 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nncw8"] Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.072925 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-catalog-content\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.073165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-886hb\" (UniqueName: \"kubernetes.io/projected/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-kube-api-access-886hb\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.073204 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-utilities\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.174020 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-catalog-content\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.174078 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-886hb\" (UniqueName: \"kubernetes.io/projected/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-kube-api-access-886hb\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.174133 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-utilities\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.174784 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-catalog-content\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.174810 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-utilities\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.209078 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-886hb\" (UniqueName: \"kubernetes.io/projected/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-kube-api-access-886hb\") pod \"redhat-marketplace-nncw8\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.293815 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:22 crc kubenswrapper[4766]: I1209 03:26:22.507556 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nncw8"] Dec 09 03:26:23 crc kubenswrapper[4766]: I1209 03:26:23.459859 4766 generic.go:334] "Generic (PLEG): container finished" podID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerID="6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537" exitCode=0 Dec 09 03:26:23 crc kubenswrapper[4766]: I1209 03:26:23.459993 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nncw8" event={"ID":"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2","Type":"ContainerDied","Data":"6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537"} Dec 09 03:26:23 crc kubenswrapper[4766]: I1209 03:26:23.461564 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nncw8" event={"ID":"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2","Type":"ContainerStarted","Data":"bd9d62a78f4a7a575e40946561995c9f238e0ee8c22d6bfa6c94af6bab39e395"} Dec 09 03:26:24 crc kubenswrapper[4766]: I1209 03:26:24.472084 4766 generic.go:334] "Generic (PLEG): container finished" podID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerID="10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47" exitCode=0 Dec 09 03:26:24 crc kubenswrapper[4766]: I1209 03:26:24.472242 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nncw8" event={"ID":"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2","Type":"ContainerDied","Data":"10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47"} Dec 09 03:26:25 crc kubenswrapper[4766]: I1209 03:26:25.503086 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nncw8" 
event={"ID":"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2","Type":"ContainerStarted","Data":"4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab"} Dec 09 03:26:25 crc kubenswrapper[4766]: I1209 03:26:25.540891 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nncw8" podStartSLOduration=3.160453767 podStartE2EDuration="4.540861016s" podCreationTimestamp="2025-12-09 03:26:21 +0000 UTC" firstStartedPulling="2025-12-09 03:26:23.462030438 +0000 UTC m=+865.171335894" lastFinishedPulling="2025-12-09 03:26:24.842437677 +0000 UTC m=+866.551743143" observedRunningTime="2025-12-09 03:26:25.533082957 +0000 UTC m=+867.242388393" watchObservedRunningTime="2025-12-09 03:26:25.540861016 +0000 UTC m=+867.250166482" Dec 09 03:26:32 crc kubenswrapper[4766]: I1209 03:26:32.293977 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:32 crc kubenswrapper[4766]: I1209 03:26:32.294625 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:32 crc kubenswrapper[4766]: I1209 03:26:32.368755 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:32 crc kubenswrapper[4766]: I1209 03:26:32.637866 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:32 crc kubenswrapper[4766]: I1209 03:26:32.688732 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nncw8"] Dec 09 03:26:34 crc kubenswrapper[4766]: I1209 03:26:34.585101 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nncw8" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerName="registry-server" 
containerID="cri-o://4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab" gracePeriod=2 Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.523636 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.596556 4766 generic.go:334] "Generic (PLEG): container finished" podID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerID="4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab" exitCode=0 Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.596616 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nncw8" event={"ID":"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2","Type":"ContainerDied","Data":"4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab"} Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.596663 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nncw8" event={"ID":"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2","Type":"ContainerDied","Data":"bd9d62a78f4a7a575e40946561995c9f238e0ee8c22d6bfa6c94af6bab39e395"} Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.596674 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nncw8" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.596685 4766 scope.go:117] "RemoveContainer" containerID="4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.622287 4766 scope.go:117] "RemoveContainer" containerID="10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.645350 4766 scope.go:117] "RemoveContainer" containerID="6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.659870 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-886hb\" (UniqueName: \"kubernetes.io/projected/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-kube-api-access-886hb\") pod \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.659962 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-catalog-content\") pod \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.660072 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-utilities\") pod \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\" (UID: \"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2\") " Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.661723 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-utilities" (OuterVolumeSpecName: "utilities") pod "a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" (UID: 
"a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.668943 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-kube-api-access-886hb" (OuterVolumeSpecName: "kube-api-access-886hb") pod "a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" (UID: "a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2"). InnerVolumeSpecName "kube-api-access-886hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.672043 4766 scope.go:117] "RemoveContainer" containerID="4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab" Dec 09 03:26:35 crc kubenswrapper[4766]: E1209 03:26:35.672530 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab\": container with ID starting with 4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab not found: ID does not exist" containerID="4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.672597 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab"} err="failed to get container status \"4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab\": rpc error: code = NotFound desc = could not find container \"4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab\": container with ID starting with 4d0d35e87532fb44be5ae8f1d79e68ad354910110ab03aaf09262aeb7e72ddab not found: ID does not exist" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.672636 4766 scope.go:117] "RemoveContainer" 
containerID="10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47" Dec 09 03:26:35 crc kubenswrapper[4766]: E1209 03:26:35.673064 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47\": container with ID starting with 10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47 not found: ID does not exist" containerID="10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.673110 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47"} err="failed to get container status \"10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47\": rpc error: code = NotFound desc = could not find container \"10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47\": container with ID starting with 10f9314252e36218974e8865f7656522f3d8729df14157a21f45f4c9c8421d47 not found: ID does not exist" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.673141 4766 scope.go:117] "RemoveContainer" containerID="6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537" Dec 09 03:26:35 crc kubenswrapper[4766]: E1209 03:26:35.673499 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537\": container with ID starting with 6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537 not found: ID does not exist" containerID="6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.673544 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537"} err="failed to get container status \"6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537\": rpc error: code = NotFound desc = could not find container \"6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537\": container with ID starting with 6eb875f8e3d0c887b7d213b3c2851578c53487b2f14177978f400512c13d0537 not found: ID does not exist" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.685480 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" (UID: "a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.761833 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-886hb\" (UniqueName: \"kubernetes.io/projected/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-kube-api-access-886hb\") on node \"crc\" DevicePath \"\"" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.761885 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.761906 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.948052 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nncw8"] Dec 09 03:26:35 crc kubenswrapper[4766]: I1209 03:26:35.955573 4766 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-nncw8"] Dec 09 03:26:36 crc kubenswrapper[4766]: I1209 03:26:36.851059 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" path="/var/lib/kubelet/pods/a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2/volumes" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.716325 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-2ms6v"] Dec 09 03:27:03 crc kubenswrapper[4766]: E1209 03:27:03.718781 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerName="extract-utilities" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.718983 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerName="extract-utilities" Dec 09 03:27:03 crc kubenswrapper[4766]: E1209 03:27:03.719181 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerName="extract-content" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.719429 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerName="extract-content" Dec 09 03:27:03 crc kubenswrapper[4766]: E1209 03:27:03.719627 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerName="registry-server" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.719789 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerName="registry-server" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.720181 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ee591e-8d0f-43cf-8d8f-049f72a7f1e2" containerName="registry-server" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.721043 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.724680 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2ms6v"] Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.726083 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.726401 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.726757 4766 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xhpdj" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.727289 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.854294 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a7ab16b-3406-45b8-87bb-94799ff01d2e-crc-storage\") pod \"crc-storage-crc-2ms6v\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.854605 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a7ab16b-3406-45b8-87bb-94799ff01d2e-node-mnt\") pod \"crc-storage-crc-2ms6v\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.854755 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdj4d\" (UniqueName: \"kubernetes.io/projected/7a7ab16b-3406-45b8-87bb-94799ff01d2e-kube-api-access-mdj4d\") pod \"crc-storage-crc-2ms6v\" (UID: 
\"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.955707 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a7ab16b-3406-45b8-87bb-94799ff01d2e-crc-storage\") pod \"crc-storage-crc-2ms6v\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.955769 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a7ab16b-3406-45b8-87bb-94799ff01d2e-node-mnt\") pod \"crc-storage-crc-2ms6v\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.955804 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdj4d\" (UniqueName: \"kubernetes.io/projected/7a7ab16b-3406-45b8-87bb-94799ff01d2e-kube-api-access-mdj4d\") pod \"crc-storage-crc-2ms6v\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.956143 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a7ab16b-3406-45b8-87bb-94799ff01d2e-node-mnt\") pod \"crc-storage-crc-2ms6v\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.957048 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a7ab16b-3406-45b8-87bb-94799ff01d2e-crc-storage\") pod \"crc-storage-crc-2ms6v\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:03 crc kubenswrapper[4766]: I1209 03:27:03.989152 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdj4d\" (UniqueName: \"kubernetes.io/projected/7a7ab16b-3406-45b8-87bb-94799ff01d2e-kube-api-access-mdj4d\") pod \"crc-storage-crc-2ms6v\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:04 crc kubenswrapper[4766]: I1209 03:27:04.039819 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:04 crc kubenswrapper[4766]: I1209 03:27:04.314362 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2ms6v"] Dec 09 03:27:04 crc kubenswrapper[4766]: I1209 03:27:04.793950 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2ms6v" event={"ID":"7a7ab16b-3406-45b8-87bb-94799ff01d2e","Type":"ContainerStarted","Data":"e0f937d9e1ec00fea26fb82f843a51cf61d3c03537becd31af5e120d03b76f71"} Dec 09 03:27:05 crc kubenswrapper[4766]: I1209 03:27:05.802550 4766 generic.go:334] "Generic (PLEG): container finished" podID="7a7ab16b-3406-45b8-87bb-94799ff01d2e" containerID="bdceedd8d8ebb9eb3cafcf36d93bd17206914478df573062d74cfd4a92b18e29" exitCode=0 Dec 09 03:27:05 crc kubenswrapper[4766]: I1209 03:27:05.802658 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2ms6v" event={"ID":"7a7ab16b-3406-45b8-87bb-94799ff01d2e","Type":"ContainerDied","Data":"bdceedd8d8ebb9eb3cafcf36d93bd17206914478df573062d74cfd4a92b18e29"} Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.039469 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.197159 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a7ab16b-3406-45b8-87bb-94799ff01d2e-crc-storage\") pod \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.197261 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a7ab16b-3406-45b8-87bb-94799ff01d2e-node-mnt\") pod \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.197285 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdj4d\" (UniqueName: \"kubernetes.io/projected/7a7ab16b-3406-45b8-87bb-94799ff01d2e-kube-api-access-mdj4d\") pod \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\" (UID: \"7a7ab16b-3406-45b8-87bb-94799ff01d2e\") " Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.197409 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a7ab16b-3406-45b8-87bb-94799ff01d2e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7a7ab16b-3406-45b8-87bb-94799ff01d2e" (UID: "7a7ab16b-3406-45b8-87bb-94799ff01d2e"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.197844 4766 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a7ab16b-3406-45b8-87bb-94799ff01d2e-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.204529 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7ab16b-3406-45b8-87bb-94799ff01d2e-kube-api-access-mdj4d" (OuterVolumeSpecName: "kube-api-access-mdj4d") pod "7a7ab16b-3406-45b8-87bb-94799ff01d2e" (UID: "7a7ab16b-3406-45b8-87bb-94799ff01d2e"). InnerVolumeSpecName "kube-api-access-mdj4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.220101 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a7ab16b-3406-45b8-87bb-94799ff01d2e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7a7ab16b-3406-45b8-87bb-94799ff01d2e" (UID: "7a7ab16b-3406-45b8-87bb-94799ff01d2e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.299097 4766 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a7ab16b-3406-45b8-87bb-94799ff01d2e-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.299133 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdj4d\" (UniqueName: \"kubernetes.io/projected/7a7ab16b-3406-45b8-87bb-94799ff01d2e-kube-api-access-mdj4d\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.316903 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.316959 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.818200 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2ms6v" event={"ID":"7a7ab16b-3406-45b8-87bb-94799ff01d2e","Type":"ContainerDied","Data":"e0f937d9e1ec00fea26fb82f843a51cf61d3c03537becd31af5e120d03b76f71"} Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.818290 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f937d9e1ec00fea26fb82f843a51cf61d3c03537becd31af5e120d03b76f71" Dec 09 03:27:07 crc kubenswrapper[4766]: I1209 03:27:07.818308 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2ms6v" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.120257 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9"] Dec 09 03:27:16 crc kubenswrapper[4766]: E1209 03:27:16.120815 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7ab16b-3406-45b8-87bb-94799ff01d2e" containerName="storage" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.120826 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7ab16b-3406-45b8-87bb-94799ff01d2e" containerName="storage" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.120921 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7ab16b-3406-45b8-87bb-94799ff01d2e" containerName="storage" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.121663 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.123279 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.160905 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9"] Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.220742 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 
03:27:16.220816 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.220882 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l47z\" (UniqueName: \"kubernetes.io/projected/63e40c3d-4950-4959-8841-11e962ffc2af-kube-api-access-5l47z\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.322464 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.322506 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.322731 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5l47z\" (UniqueName: \"kubernetes.io/projected/63e40c3d-4950-4959-8841-11e962ffc2af-kube-api-access-5l47z\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.323436 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.323557 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.348941 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l47z\" (UniqueName: \"kubernetes.io/projected/63e40c3d-4950-4959-8841-11e962ffc2af-kube-api-access-5l47z\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.445641 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.649745 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9"] Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.873145 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" event={"ID":"63e40c3d-4950-4959-8841-11e962ffc2af","Type":"ContainerStarted","Data":"53190e9d43cf4f2132882f15aaa9fefe022256fa11f5428719cdf7ab78b1981e"} Dec 09 03:27:16 crc kubenswrapper[4766]: I1209 03:27:16.873446 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" event={"ID":"63e40c3d-4950-4959-8841-11e962ffc2af","Type":"ContainerStarted","Data":"cce239656c51ed2d13ca9742112989e95b0643c9d9fee88bbdce6c46cbcfb493"} Dec 09 03:27:17 crc kubenswrapper[4766]: I1209 03:27:17.881916 4766 generic.go:334] "Generic (PLEG): container finished" podID="63e40c3d-4950-4959-8841-11e962ffc2af" containerID="53190e9d43cf4f2132882f15aaa9fefe022256fa11f5428719cdf7ab78b1981e" exitCode=0 Dec 09 03:27:17 crc kubenswrapper[4766]: I1209 03:27:17.883810 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" event={"ID":"63e40c3d-4950-4959-8841-11e962ffc2af","Type":"ContainerDied","Data":"53190e9d43cf4f2132882f15aaa9fefe022256fa11f5428719cdf7ab78b1981e"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.337493 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-62t52"] Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.338145 4766 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovn-controller" containerID="cri-o://d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12" gracePeriod=30 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.338287 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="northd" containerID="cri-o://e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e" gracePeriod=30 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.338344 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kube-rbac-proxy-node" containerID="cri-o://65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71" gracePeriod=30 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.338287 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="nbdb" containerID="cri-o://84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d" gracePeriod=30 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.338467 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7" gracePeriod=30 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.338450 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="sbdb" 
containerID="cri-o://c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82" gracePeriod=30 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.338410 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovn-acl-logging" containerID="cri-o://f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d" gracePeriod=30 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.415866 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" containerID="cri-o://a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" gracePeriod=30 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.727585 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/3.log" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.729773 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovn-acl-logging/0.log" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.730772 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovn-controller/0.log" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.731260 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.758726 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-bin\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.758794 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-netd\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.758833 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-systemd-units\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.758880 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovn-node-metrics-cert\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.758927 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-ovn-kubernetes\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.758955 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-ovn\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.758985 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-script-lib\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-systemd\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759042 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-netns\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759072 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-slash\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759141 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-env-overrides\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc 
kubenswrapper[4766]: I1209 03:27:18.759195 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-config\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759267 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tw62\" (UniqueName: \"kubernetes.io/projected/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-kube-api-access-8tw62\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759308 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-openvswitch\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759342 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-kubelet\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759384 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-node-log\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759467 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-var-lib-openvswitch\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759508 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-log-socket\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759539 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-etc-openvswitch\") pod \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\" (UID: \"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec\") " Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759898 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759949 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.759985 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.760798 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.760845 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.761783 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.761849 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.761875 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.761898 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.761921 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-node-log" (OuterVolumeSpecName: "node-log") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.761944 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.761966 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-log-socket" (OuterVolumeSpecName: "log-socket") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.761990 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-slash" (OuterVolumeSpecName: "host-slash") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.763138 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.763567 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.763578 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.765253 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.760586 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.765582 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-kube-api-access-8tw62" (OuterVolumeSpecName: "kube-api-access-8tw62") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "kube-api-access-8tw62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.785926 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" (UID: "d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.796986 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mjnb6"] Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797317 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="sbdb" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797335 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="sbdb" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797349 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovn-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797361 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovn-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797377 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797389 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797404 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kubecfg-setup" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797414 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kubecfg-setup" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797429 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" 
containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797440 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797460 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kube-rbac-proxy-node" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797471 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kube-rbac-proxy-node" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797488 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797499 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797512 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovn-acl-logging" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797522 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovn-acl-logging" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797542 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797553 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797568 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="nbdb" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797579 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="nbdb" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.797594 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="northd" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797605 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="northd" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797747 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="nbdb" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797762 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797779 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797799 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="sbdb" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797815 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797830 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797842 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="kube-rbac-proxy-node" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797858 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovn-acl-logging" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797898 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovn-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.797913 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="northd" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.798070 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.798083 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: E1209 03:27:18.798100 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.798111 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.798296 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.798311 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerName="ovnkube-controller" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.800766 4766 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860691 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-run-ovn-kubernetes\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860743 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-env-overrides\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860770 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-cni-netd\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860799 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-slash\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860820 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-var-lib-openvswitch\") pod 
\"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860847 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-systemd\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860871 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860895 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-kubelet\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860913 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-cni-bin\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.860935 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-etc-openvswitch\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861121 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-run-netns\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861176 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovnkube-script-lib\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861196 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmgb\" (UniqueName: \"kubernetes.io/projected/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-kube-api-access-2kmgb\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861271 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovnkube-config\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861318 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-log-socket\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861360 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-systemd-units\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861402 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-ovn\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861445 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-openvswitch\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861487 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovn-node-metrics-cert\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861515 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-node-log\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861558 4766 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861573 4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861586 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tw62\" (UniqueName: \"kubernetes.io/projected/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-kube-api-access-8tw62\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861598 4766 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861612 4766 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861625 4766 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861637 4766 
reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861646 4766 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-node-log\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861656 4766 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-log-socket\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861667 4766 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861678 4766 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861690 4766 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861701 4766 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861713 4766 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861725 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861735 4766 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861745 4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861757 4766 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-slash\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861767 4766 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.861777 4766 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.890348 4766 generic.go:334] "Generic (PLEG): container finished" podID="63e40c3d-4950-4959-8841-11e962ffc2af" containerID="00ac2948c4cc85868aad867a706a6047f5297ffde1a29fd87491f12b56188dc0" exitCode=0 Dec 09 03:27:18 crc 
kubenswrapper[4766]: I1209 03:27:18.890424 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" event={"ID":"63e40c3d-4950-4959-8841-11e962ffc2af","Type":"ContainerDied","Data":"00ac2948c4cc85868aad867a706a6047f5297ffde1a29fd87491f12b56188dc0"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.893705 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovnkube-controller/3.log" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.896530 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovn-acl-logging/0.log" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.896966 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-62t52_d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/ovn-controller/0.log" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.897496 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" exitCode=0 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.897637 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82" exitCode=0 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.897687 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d" exitCode=0 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.897744 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" 
containerID="e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e" exitCode=0 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.897810 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7" exitCode=0 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.897873 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71" exitCode=0 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.897952 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d" exitCode=143 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898016 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" containerID="d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12" exitCode=143 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898226 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898555 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898643 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898660 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898688 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898699 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} Dec 09 03:27:18 crc 
kubenswrapper[4766]: I1209 03:27:18.898711 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898723 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898730 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898737 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898744 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898751 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898757 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898764 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} Dec 09 03:27:18 crc 
kubenswrapper[4766]: I1209 03:27:18.898771 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898780 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898784 4766 scope.go:117] "RemoveContainer" containerID="a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898791 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898920 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898935 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898941 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898949 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} Dec 09 03:27:18 crc 
kubenswrapper[4766]: I1209 03:27:18.898970 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898976 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898981 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898986 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.898991 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899007 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899024 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899045 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899050 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899056 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899060 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899066 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899070 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899075 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899080 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899085 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899092 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-62t52" event={"ID":"d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec","Type":"ContainerDied","Data":"e1e35f022910b1a7046ab8cdbfdb4cd4a191657cfb40af89d3014ce856160bd4"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899100 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899123 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899129 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899135 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899140 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899146 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899150 4766 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899155 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899160 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.899164 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.902753 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/2.log" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.903143 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/1.log" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.903253 4766 generic.go:334] "Generic (PLEG): container finished" podID="c83a9d31-9c87-4a13-ab9a-2992e852eb47" containerID="25bcb6c0cf26212f22beb70f607504f6c683f40eb583a1e29ff90970937bad7a" exitCode=2 Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.903492 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gx9l2" event={"ID":"c83a9d31-9c87-4a13-ab9a-2992e852eb47","Type":"ContainerDied","Data":"25bcb6c0cf26212f22beb70f607504f6c683f40eb583a1e29ff90970937bad7a"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.903576 4766 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4"} Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.904071 4766 scope.go:117] "RemoveContainer" containerID="25bcb6c0cf26212f22beb70f607504f6c683f40eb583a1e29ff90970937bad7a" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.953388 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.957640 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-62t52"] Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.962462 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-62t52"] Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.962891 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-log-socket\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.962926 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-systemd-units\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.962972 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-ovn\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.962998 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-openvswitch\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963023 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovn-node-metrics-cert\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963051 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-node-log\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963052 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-systemd-units\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963074 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-run-ovn-kubernetes\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc 
kubenswrapper[4766]: I1209 03:27:18.963096 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-env-overrides\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963118 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-cni-netd\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963121 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-log-socket\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963143 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-slash\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963159 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-node-log\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963166 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-var-lib-openvswitch\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-ovn\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963226 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-systemd\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963255 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963276 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-openvswitch\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963285 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-kubelet\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963308 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-cni-bin\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963328 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-var-lib-openvswitch\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963332 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-etc-openvswitch\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963363 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-etc-openvswitch\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963426 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-run-netns\") pod 
\"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963470 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovnkube-script-lib\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963488 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmgb\" (UniqueName: \"kubernetes.io/projected/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-kube-api-access-2kmgb\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963511 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovnkube-config\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963688 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-run-netns\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963732 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-cni-netd\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963758 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-slash\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963917 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-run-systemd\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963952 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963979 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-env-overrides\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.963994 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-run-ovn-kubernetes\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 
crc kubenswrapper[4766]: I1209 03:27:18.963981 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-kubelet\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.964002 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-host-cni-bin\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.964270 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovnkube-config\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.964823 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovnkube-script-lib\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.966805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-ovn-node-metrics-cert\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.978209 4766 scope.go:117] "RemoveContainer" 
containerID="c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.980496 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kmgb\" (UniqueName: \"kubernetes.io/projected/4a509a2a-0e08-4d21-a377-cbd0aebe0ba2-kube-api-access-2kmgb\") pod \"ovnkube-node-mjnb6\" (UID: \"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:18 crc kubenswrapper[4766]: I1209 03:27:18.990122 4766 scope.go:117] "RemoveContainer" containerID="84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.002483 4766 scope.go:117] "RemoveContainer" containerID="e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.013241 4766 scope.go:117] "RemoveContainer" containerID="e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.025569 4766 scope.go:117] "RemoveContainer" containerID="65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.041837 4766 scope.go:117] "RemoveContainer" containerID="f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.058859 4766 scope.go:117] "RemoveContainer" containerID="d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.072077 4766 scope.go:117] "RemoveContainer" containerID="8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.086892 4766 scope.go:117] "RemoveContainer" containerID="a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.087849 4766 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": container with ID starting with a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1 not found: ID does not exist" containerID="a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.087904 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} err="failed to get container status \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": rpc error: code = NotFound desc = could not find container \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": container with ID starting with a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.087933 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.088288 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": container with ID starting with 961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc not found: ID does not exist" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.088349 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} err="failed to get container status \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": rpc error: code = NotFound desc = could not find 
container \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": container with ID starting with 961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.088367 4766 scope.go:117] "RemoveContainer" containerID="c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.088743 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": container with ID starting with c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82 not found: ID does not exist" containerID="c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.088771 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} err="failed to get container status \"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": rpc error: code = NotFound desc = could not find container \"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": container with ID starting with c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.088791 4766 scope.go:117] "RemoveContainer" containerID="84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.089005 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": container with ID starting with 84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d not found: ID does 
not exist" containerID="84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.089029 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} err="failed to get container status \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": rpc error: code = NotFound desc = could not find container \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": container with ID starting with 84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.089046 4766 scope.go:117] "RemoveContainer" containerID="e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.089281 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": container with ID starting with e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e not found: ID does not exist" containerID="e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.089313 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} err="failed to get container status \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": rpc error: code = NotFound desc = could not find container \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": container with ID starting with e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.089330 4766 
scope.go:117] "RemoveContainer" containerID="e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.089542 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": container with ID starting with e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7 not found: ID does not exist" containerID="e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.089566 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} err="failed to get container status \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": rpc error: code = NotFound desc = could not find container \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": container with ID starting with e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.089581 4766 scope.go:117] "RemoveContainer" containerID="65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.090489 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": container with ID starting with 65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71 not found: ID does not exist" containerID="65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.090518 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} err="failed to get container status \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": rpc error: code = NotFound desc = could not find container \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": container with ID starting with 65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.090536 4766 scope.go:117] "RemoveContainer" containerID="f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.090800 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": container with ID starting with f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d not found: ID does not exist" containerID="f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.090832 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} err="failed to get container status \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": rpc error: code = NotFound desc = could not find container \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": container with ID starting with f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.090858 4766 scope.go:117] "RemoveContainer" containerID="d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.091164 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": container with ID starting with d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12 not found: ID does not exist" containerID="d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.091204 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} err="failed to get container status \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": rpc error: code = NotFound desc = could not find container \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": container with ID starting with d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.091241 4766 scope.go:117] "RemoveContainer" containerID="8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0" Dec 09 03:27:19 crc kubenswrapper[4766]: E1209 03:27:19.091441 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": container with ID starting with 8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0 not found: ID does not exist" containerID="8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.091466 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} err="failed to get container status \"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": rpc error: code = NotFound desc = could not find container 
\"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": container with ID starting with 8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.091559 4766 scope.go:117] "RemoveContainer" containerID="a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.091803 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} err="failed to get container status \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": rpc error: code = NotFound desc = could not find container \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": container with ID starting with a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.091826 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.092037 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} err="failed to get container status \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": rpc error: code = NotFound desc = could not find container \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": container with ID starting with 961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.092059 4766 scope.go:117] "RemoveContainer" containerID="c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.092290 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} err="failed to get container status \"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": rpc error: code = NotFound desc = could not find container \"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": container with ID starting with c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.092314 4766 scope.go:117] "RemoveContainer" containerID="84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.092547 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} err="failed to get container status \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": rpc error: code = NotFound desc = could not find container \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": container with ID starting with 84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.092570 4766 scope.go:117] "RemoveContainer" containerID="e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.092829 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} err="failed to get container status \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": rpc error: code = NotFound desc = could not find container \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": container with ID starting with 
e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.092865 4766 scope.go:117] "RemoveContainer" containerID="e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.093316 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} err="failed to get container status \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": rpc error: code = NotFound desc = could not find container \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": container with ID starting with e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.093340 4766 scope.go:117] "RemoveContainer" containerID="65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.093561 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} err="failed to get container status \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": rpc error: code = NotFound desc = could not find container \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": container with ID starting with 65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.093583 4766 scope.go:117] "RemoveContainer" containerID="f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.093789 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} err="failed to get container status \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": rpc error: code = NotFound desc = could not find container \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": container with ID starting with f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.093811 4766 scope.go:117] "RemoveContainer" containerID="d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.094122 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} err="failed to get container status \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": rpc error: code = NotFound desc = could not find container \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": container with ID starting with d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.094145 4766 scope.go:117] "RemoveContainer" containerID="8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.094366 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} err="failed to get container status \"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": rpc error: code = NotFound desc = could not find container \"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": container with ID starting with 8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0 not found: ID does not 
exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.094405 4766 scope.go:117] "RemoveContainer" containerID="a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.096270 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} err="failed to get container status \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": rpc error: code = NotFound desc = could not find container \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": container with ID starting with a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.096297 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.096518 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} err="failed to get container status \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": rpc error: code = NotFound desc = could not find container \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": container with ID starting with 961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.096542 4766 scope.go:117] "RemoveContainer" containerID="c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.096852 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} err="failed to get container status 
\"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": rpc error: code = NotFound desc = could not find container \"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": container with ID starting with c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.096875 4766 scope.go:117] "RemoveContainer" containerID="84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.097087 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} err="failed to get container status \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": rpc error: code = NotFound desc = could not find container \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": container with ID starting with 84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.097109 4766 scope.go:117] "RemoveContainer" containerID="e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.097336 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} err="failed to get container status \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": rpc error: code = NotFound desc = could not find container \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": container with ID starting with e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.097361 4766 scope.go:117] "RemoveContainer" 
containerID="e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.097594 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} err="failed to get container status \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": rpc error: code = NotFound desc = could not find container \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": container with ID starting with e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.097620 4766 scope.go:117] "RemoveContainer" containerID="65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.097792 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} err="failed to get container status \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": rpc error: code = NotFound desc = could not find container \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": container with ID starting with 65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.097815 4766 scope.go:117] "RemoveContainer" containerID="f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.098061 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} err="failed to get container status \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": rpc error: code = NotFound desc = could 
not find container \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": container with ID starting with f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.098083 4766 scope.go:117] "RemoveContainer" containerID="d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.098381 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} err="failed to get container status \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": rpc error: code = NotFound desc = could not find container \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": container with ID starting with d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.098422 4766 scope.go:117] "RemoveContainer" containerID="8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.098627 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} err="failed to get container status \"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": rpc error: code = NotFound desc = could not find container \"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": container with ID starting with 8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.098649 4766 scope.go:117] "RemoveContainer" containerID="a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 
03:27:19.099194 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} err="failed to get container status \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": rpc error: code = NotFound desc = could not find container \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": container with ID starting with a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.099247 4766 scope.go:117] "RemoveContainer" containerID="961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.099477 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc"} err="failed to get container status \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": rpc error: code = NotFound desc = could not find container \"961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc\": container with ID starting with 961cbf46bc93e09701f8880c2429543c58dfa9101bab1a4d46f26f504d5d81cc not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.099503 4766 scope.go:117] "RemoveContainer" containerID="c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.099730 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82"} err="failed to get container status \"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": rpc error: code = NotFound desc = could not find container \"c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82\": container with ID starting with 
c624fef1d8561ada69d245183cfb5cc1b06da1998ffb37cba427e6bee5252b82 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.099754 4766 scope.go:117] "RemoveContainer" containerID="84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.100020 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d"} err="failed to get container status \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": rpc error: code = NotFound desc = could not find container \"84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d\": container with ID starting with 84b98ffc96d301fd9aa5fd3ebe96d4901b2d7a1574d8638d86cded70837d2d2d not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.100043 4766 scope.go:117] "RemoveContainer" containerID="e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.100282 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e"} err="failed to get container status \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": rpc error: code = NotFound desc = could not find container \"e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e\": container with ID starting with e7692986070eb9d1065240c8e3c82cd701626b0d9924e3839a3d9e740add385e not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.100304 4766 scope.go:117] "RemoveContainer" containerID="e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.100593 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7"} err="failed to get container status \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": rpc error: code = NotFound desc = could not find container \"e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7\": container with ID starting with e9ba234b71014406ad7c57c7d3f85b086c4781d3cca5c6cd0908793396e417b7 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.100616 4766 scope.go:117] "RemoveContainer" containerID="65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.100807 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71"} err="failed to get container status \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": rpc error: code = NotFound desc = could not find container \"65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71\": container with ID starting with 65ddd662233382d9663be94bf992da8b1a0f03f69b9882dad78a5c458526de71 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.100829 4766 scope.go:117] "RemoveContainer" containerID="f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.101055 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d"} err="failed to get container status \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": rpc error: code = NotFound desc = could not find container \"f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d\": container with ID starting with f45cb7da053b4dec799b4fa738a92cb5e6d04b54c69dfc0ff1d200a34950090d not found: ID does not 
exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.101077 4766 scope.go:117] "RemoveContainer" containerID="d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.101373 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12"} err="failed to get container status \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": rpc error: code = NotFound desc = could not find container \"d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12\": container with ID starting with d0476205be08aa5ef7f48fc5ef677bb9f09009b31c2557590ed2caf22773da12 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.101395 4766 scope.go:117] "RemoveContainer" containerID="8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.101704 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0"} err="failed to get container status \"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": rpc error: code = NotFound desc = could not find container \"8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0\": container with ID starting with 8aa1703c166d62682c1c747441b5f3004f7ba3f442ddf3c7aa2ce2620ebf46f0 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.101734 4766 scope.go:117] "RemoveContainer" containerID="a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.101998 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1"} err="failed to get container status 
\"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": rpc error: code = NotFound desc = could not find container \"a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1\": container with ID starting with a284ab39ea33174a464d08435c68b7090dd1e3200fae026184899837ced3cbf1 not found: ID does not exist" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.114438 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:19 crc kubenswrapper[4766]: W1209 03:27:19.178897 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a509a2a_0e08_4d21_a377_cbd0aebe0ba2.slice/crio-b702f96f057a2e169844a062d662a2dc54f8916814c994f835050d368147304f WatchSource:0}: Error finding container b702f96f057a2e169844a062d662a2dc54f8916814c994f835050d368147304f: Status 404 returned error can't find the container with id b702f96f057a2e169844a062d662a2dc54f8916814c994f835050d368147304f Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.912936 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/2.log" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.913382 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/1.log" Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.913441 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gx9l2" event={"ID":"c83a9d31-9c87-4a13-ab9a-2992e852eb47","Type":"ContainerStarted","Data":"fb7af166f606eacf7982bd8e366ca1063d476dd14cbf53f81c5cc6baf53fbc8e"} Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.915773 4766 generic.go:334] "Generic (PLEG): container finished" podID="4a509a2a-0e08-4d21-a377-cbd0aebe0ba2" 
containerID="0354578c0bc4c692fe03f9cf4732ed144d329870abf02b7c830357a63f0fc877" exitCode=0 Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.915814 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerDied","Data":"0354578c0bc4c692fe03f9cf4732ed144d329870abf02b7c830357a63f0fc877"} Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.915851 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"b702f96f057a2e169844a062d662a2dc54f8916814c994f835050d368147304f"} Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.918093 4766 generic.go:334] "Generic (PLEG): container finished" podID="63e40c3d-4950-4959-8841-11e962ffc2af" containerID="bdffd0d9a1189975916094042c81c7a5575476c8eca51c652c2fd0b936ba0df2" exitCode=0 Dec 09 03:27:19 crc kubenswrapper[4766]: I1209 03:27:19.918129 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" event={"ID":"63e40c3d-4950-4959-8841-11e962ffc2af","Type":"ContainerDied","Data":"bdffd0d9a1189975916094042c81c7a5575476c8eca51c652c2fd0b936ba0df2"} Dec 09 03:27:20 crc kubenswrapper[4766]: I1209 03:27:20.847195 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec" path="/var/lib/kubelet/pods/d2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec/volumes" Dec 09 03:27:20 crc kubenswrapper[4766]: I1209 03:27:20.928703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"ea6e9ed66eec24762ae722c1eb1f5cc870961bdd6990adf0fbe3abcebcb233ac"} Dec 09 03:27:20 crc kubenswrapper[4766]: I1209 03:27:20.928763 4766 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"79130e5d2f1c5c701d7d8b9d62aaf00ee4b2287109300df772d615b4b236888c"} Dec 09 03:27:20 crc kubenswrapper[4766]: I1209 03:27:20.928781 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"a55d6a146733cc3388043f277921243ceaeae6897ffa68beb4aa592fb344ebc5"} Dec 09 03:27:20 crc kubenswrapper[4766]: I1209 03:27:20.928793 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"e1574809eeddc9b89f179018b6dba63547989eab8e10d611b71ccf7887cab6ff"} Dec 09 03:27:20 crc kubenswrapper[4766]: I1209 03:27:20.928805 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"d45feec912cbb6d7a5dfd6014720f04c3309a7391d60e28ccbcbaf9b3df6e7fc"} Dec 09 03:27:20 crc kubenswrapper[4766]: I1209 03:27:20.928816 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"2692052d24109eeb25902c20d5d85c0639df6b983498e98626a373411832f8db"} Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.003702 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.088045 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-bundle\") pod \"63e40c3d-4950-4959-8841-11e962ffc2af\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.088106 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l47z\" (UniqueName: \"kubernetes.io/projected/63e40c3d-4950-4959-8841-11e962ffc2af-kube-api-access-5l47z\") pod \"63e40c3d-4950-4959-8841-11e962ffc2af\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.088137 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-util\") pod \"63e40c3d-4950-4959-8841-11e962ffc2af\" (UID: \"63e40c3d-4950-4959-8841-11e962ffc2af\") " Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.089619 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-bundle" (OuterVolumeSpecName: "bundle") pod "63e40c3d-4950-4959-8841-11e962ffc2af" (UID: "63e40c3d-4950-4959-8841-11e962ffc2af"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.093874 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e40c3d-4950-4959-8841-11e962ffc2af-kube-api-access-5l47z" (OuterVolumeSpecName: "kube-api-access-5l47z") pod "63e40c3d-4950-4959-8841-11e962ffc2af" (UID: "63e40c3d-4950-4959-8841-11e962ffc2af"). InnerVolumeSpecName "kube-api-access-5l47z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.102805 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-util" (OuterVolumeSpecName: "util") pod "63e40c3d-4950-4959-8841-11e962ffc2af" (UID: "63e40c3d-4950-4959-8841-11e962ffc2af"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.189038 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.189071 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l47z\" (UniqueName: \"kubernetes.io/projected/63e40c3d-4950-4959-8841-11e962ffc2af-kube-api-access-5l47z\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.189081 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/63e40c3d-4950-4959-8841-11e962ffc2af-util\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.936066 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" event={"ID":"63e40c3d-4950-4959-8841-11e962ffc2af","Type":"ContainerDied","Data":"cce239656c51ed2d13ca9742112989e95b0643c9d9fee88bbdce6c46cbcfb493"} Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.936112 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce239656c51ed2d13ca9742112989e95b0643c9d9fee88bbdce6c46cbcfb493" Dec 09 03:27:21 crc kubenswrapper[4766]: I1209 03:27:21.936123 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.577972 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv"] Dec 09 03:27:23 crc kubenswrapper[4766]: E1209 03:27:23.578552 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e40c3d-4950-4959-8841-11e962ffc2af" containerName="extract" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.578571 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e40c3d-4950-4959-8841-11e962ffc2af" containerName="extract" Dec 09 03:27:23 crc kubenswrapper[4766]: E1209 03:27:23.578594 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e40c3d-4950-4959-8841-11e962ffc2af" containerName="util" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.578602 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e40c3d-4950-4959-8841-11e962ffc2af" containerName="util" Dec 09 03:27:23 crc kubenswrapper[4766]: E1209 03:27:23.578611 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e40c3d-4950-4959-8841-11e962ffc2af" containerName="pull" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.578620 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e40c3d-4950-4959-8841-11e962ffc2af" containerName="pull" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.578747 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e40c3d-4950-4959-8841-11e962ffc2af" containerName="extract" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.579194 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.580685 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.581115 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-x6nhr" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.582969 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.624346 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgsp\" (UniqueName: \"kubernetes.io/projected/2c9cee8d-ebaf-416e-8dde-e9bfddf9775a-kube-api-access-7lgsp\") pod \"nmstate-operator-5b5b58f5c8-dcnxv\" (UID: \"2c9cee8d-ebaf-416e-8dde-e9bfddf9775a\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.725305 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgsp\" (UniqueName: \"kubernetes.io/projected/2c9cee8d-ebaf-416e-8dde-e9bfddf9775a-kube-api-access-7lgsp\") pod \"nmstate-operator-5b5b58f5c8-dcnxv\" (UID: \"2c9cee8d-ebaf-416e-8dde-e9bfddf9775a\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.755395 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgsp\" (UniqueName: \"kubernetes.io/projected/2c9cee8d-ebaf-416e-8dde-e9bfddf9775a-kube-api-access-7lgsp\") pod \"nmstate-operator-5b5b58f5c8-dcnxv\" (UID: \"2c9cee8d-ebaf-416e-8dde-e9bfddf9775a\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.893823 4766 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:23 crc kubenswrapper[4766]: E1209 03:27:23.926420 4766 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a_0(5ee282aa60da5a1241b851e5fb4ed8ec6e644f22fb93ebae5b84750c9826b5da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 03:27:23 crc kubenswrapper[4766]: E1209 03:27:23.926838 4766 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a_0(5ee282aa60da5a1241b851e5fb4ed8ec6e644f22fb93ebae5b84750c9826b5da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:23 crc kubenswrapper[4766]: E1209 03:27:23.926869 4766 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a_0(5ee282aa60da5a1241b851e5fb4ed8ec6e644f22fb93ebae5b84750c9826b5da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:23 crc kubenswrapper[4766]: E1209 03:27:23.926925 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate(2c9cee8d-ebaf-416e-8dde-e9bfddf9775a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate(2c9cee8d-ebaf-416e-8dde-e9bfddf9775a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a_0(5ee282aa60da5a1241b851e5fb4ed8ec6e644f22fb93ebae5b84750c9826b5da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" podUID="2c9cee8d-ebaf-416e-8dde-e9bfddf9775a" Dec 09 03:27:23 crc kubenswrapper[4766]: I1209 03:27:23.949656 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"0933f052020b0acb1485066a5ea53dc7c71df7b1960d0266db496eae4a5c93ad"} Dec 09 03:27:25 crc kubenswrapper[4766]: I1209 03:27:25.968937 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" event={"ID":"4a509a2a-0e08-4d21-a377-cbd0aebe0ba2","Type":"ContainerStarted","Data":"74ae87ebd06a530523a958d96c91de964ad9dbb0dd17c23176412171ccc437a9"} Dec 09 03:27:26 crc kubenswrapper[4766]: I1209 03:27:26.933909 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv"] Dec 09 03:27:26 crc kubenswrapper[4766]: I1209 03:27:26.934021 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:26 crc kubenswrapper[4766]: I1209 03:27:26.934438 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:26 crc kubenswrapper[4766]: E1209 03:27:26.959902 4766 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a_0(ff26e415f087f1f9f6bcef23fcfd1d5c6ce14112b56e98b497cba1cfc1a7c1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 03:27:26 crc kubenswrapper[4766]: E1209 03:27:26.960225 4766 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a_0(ff26e415f087f1f9f6bcef23fcfd1d5c6ce14112b56e98b497cba1cfc1a7c1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:26 crc kubenswrapper[4766]: E1209 03:27:26.960252 4766 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a_0(ff26e415f087f1f9f6bcef23fcfd1d5c6ce14112b56e98b497cba1cfc1a7c1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:26 crc kubenswrapper[4766]: E1209 03:27:26.960298 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate(2c9cee8d-ebaf-416e-8dde-e9bfddf9775a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate(2c9cee8d-ebaf-416e-8dde-e9bfddf9775a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-5b5b58f5c8-dcnxv_openshift-nmstate_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a_0(ff26e415f087f1f9f6bcef23fcfd1d5c6ce14112b56e98b497cba1cfc1a7c1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" podUID="2c9cee8d-ebaf-416e-8dde-e9bfddf9775a" Dec 09 03:27:26 crc kubenswrapper[4766]: I1209 03:27:26.973775 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:26 crc kubenswrapper[4766]: I1209 03:27:26.973884 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.002977 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.012791 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" podStartSLOduration=9.012768504 podStartE2EDuration="9.012768504s" podCreationTimestamp="2025-12-09 03:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:27:27.012049704 +0000 UTC m=+928.721355130" watchObservedRunningTime="2025-12-09 03:27:27.012768504 
+0000 UTC m=+928.722073930" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.678944 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pml89"] Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.680903 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.685846 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pml89"] Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.777606 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rqq\" (UniqueName: \"kubernetes.io/projected/3dcfb187-78b1-4f21-bbab-edff562f4033-kube-api-access-44rqq\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.778037 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-catalog-content\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.778065 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-utilities\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.878968 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-catalog-content\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.879124 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-utilities\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.879497 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-catalog-content\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.880331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rqq\" (UniqueName: \"kubernetes.io/projected/3dcfb187-78b1-4f21-bbab-edff562f4033-kube-api-access-44rqq\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.880961 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-utilities\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.927149 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rqq\" (UniqueName: 
\"kubernetes.io/projected/3dcfb187-78b1-4f21-bbab-edff562f4033-kube-api-access-44rqq\") pod \"community-operators-pml89\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.980186 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:27 crc kubenswrapper[4766]: I1209 03:27:27.995546 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:28 crc kubenswrapper[4766]: I1209 03:27:28.023475 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:28 crc kubenswrapper[4766]: E1209 03:27:28.026275 4766 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-pml89_openshift-marketplace_3dcfb187-78b1-4f21-bbab-edff562f4033_0(fa9943774c65594bd9d6121f5bb020253194e511c7dc36addd5db7a0314ae8b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 03:27:28 crc kubenswrapper[4766]: E1209 03:27:28.026341 4766 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-pml89_openshift-marketplace_3dcfb187-78b1-4f21-bbab-edff562f4033_0(fa9943774c65594bd9d6121f5bb020253194e511c7dc36addd5db7a0314ae8b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:28 crc kubenswrapper[4766]: E1209 03:27:28.026365 4766 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-pml89_openshift-marketplace_3dcfb187-78b1-4f21-bbab-edff562f4033_0(fa9943774c65594bd9d6121f5bb020253194e511c7dc36addd5db7a0314ae8b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:28 crc kubenswrapper[4766]: E1209 03:27:28.026404 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-pml89_openshift-marketplace(3dcfb187-78b1-4f21-bbab-edff562f4033)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-pml89_openshift-marketplace(3dcfb187-78b1-4f21-bbab-edff562f4033)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-pml89_openshift-marketplace_3dcfb187-78b1-4f21-bbab-edff562f4033_0(fa9943774c65594bd9d6121f5bb020253194e511c7dc36addd5db7a0314ae8b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/community-operators-pml89" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" Dec 09 03:27:28 crc kubenswrapper[4766]: I1209 03:27:28.985817 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:28 crc kubenswrapper[4766]: I1209 03:27:28.986665 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:29 crc kubenswrapper[4766]: E1209 03:27:29.020418 4766 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-pml89_openshift-marketplace_3dcfb187-78b1-4f21-bbab-edff562f4033_0(6971ece4a55b14c5f2a8a95326c54233ef971b3adba1801e3fa7e59b347dbd66): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 03:27:29 crc kubenswrapper[4766]: E1209 03:27:29.020488 4766 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-pml89_openshift-marketplace_3dcfb187-78b1-4f21-bbab-edff562f4033_0(6971ece4a55b14c5f2a8a95326c54233ef971b3adba1801e3fa7e59b347dbd66): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:29 crc kubenswrapper[4766]: E1209 03:27:29.020508 4766 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-pml89_openshift-marketplace_3dcfb187-78b1-4f21-bbab-edff562f4033_0(6971ece4a55b14c5f2a8a95326c54233ef971b3adba1801e3fa7e59b347dbd66): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:29 crc kubenswrapper[4766]: E1209 03:27:29.020550 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-pml89_openshift-marketplace(3dcfb187-78b1-4f21-bbab-edff562f4033)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-pml89_openshift-marketplace(3dcfb187-78b1-4f21-bbab-edff562f4033)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-pml89_openshift-marketplace_3dcfb187-78b1-4f21-bbab-edff562f4033_0(6971ece4a55b14c5f2a8a95326c54233ef971b3adba1801e3fa7e59b347dbd66): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/community-operators-pml89" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.464700 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b52zs"] Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.468121 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.491663 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b52zs"] Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.565673 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfbsj\" (UniqueName: \"kubernetes.io/projected/5c004078-5148-40dc-9b15-a134ea8fad13-kube-api-access-pfbsj\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.566058 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-catalog-content\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.566368 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-utilities\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.667745 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfbsj\" (UniqueName: \"kubernetes.io/projected/5c004078-5148-40dc-9b15-a134ea8fad13-kube-api-access-pfbsj\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.667828 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-catalog-content\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.667926 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-utilities\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.668515 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-catalog-content\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.668711 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-utilities\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.692711 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfbsj\" (UniqueName: \"kubernetes.io/projected/5c004078-5148-40dc-9b15-a134ea8fad13-kube-api-access-pfbsj\") pod \"certified-operators-b52zs\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:34 crc kubenswrapper[4766]: I1209 03:27:34.807242 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:35 crc kubenswrapper[4766]: I1209 03:27:35.235368 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b52zs"] Dec 09 03:27:35 crc kubenswrapper[4766]: W1209 03:27:35.247474 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c004078_5148_40dc_9b15_a134ea8fad13.slice/crio-c9d0697a72a69d200e6cbf5488d4d456f6555d6a322175dc37978b684fdeed02 WatchSource:0}: Error finding container c9d0697a72a69d200e6cbf5488d4d456f6555d6a322175dc37978b684fdeed02: Status 404 returned error can't find the container with id c9d0697a72a69d200e6cbf5488d4d456f6555d6a322175dc37978b684fdeed02 Dec 09 03:27:36 crc kubenswrapper[4766]: I1209 03:27:36.051973 4766 generic.go:334] "Generic (PLEG): container finished" podID="5c004078-5148-40dc-9b15-a134ea8fad13" containerID="18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32" exitCode=0 Dec 09 03:27:36 crc kubenswrapper[4766]: I1209 03:27:36.052058 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b52zs" event={"ID":"5c004078-5148-40dc-9b15-a134ea8fad13","Type":"ContainerDied","Data":"18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32"} Dec 09 03:27:36 crc kubenswrapper[4766]: I1209 03:27:36.052126 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b52zs" event={"ID":"5c004078-5148-40dc-9b15-a134ea8fad13","Type":"ContainerStarted","Data":"c9d0697a72a69d200e6cbf5488d4d456f6555d6a322175dc37978b684fdeed02"} Dec 09 03:27:37 crc kubenswrapper[4766]: I1209 03:27:37.059203 4766 generic.go:334] "Generic (PLEG): container finished" podID="5c004078-5148-40dc-9b15-a134ea8fad13" containerID="0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42" exitCode=0 Dec 09 03:27:37 crc kubenswrapper[4766]: I1209 
03:27:37.059302 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b52zs" event={"ID":"5c004078-5148-40dc-9b15-a134ea8fad13","Type":"ContainerDied","Data":"0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42"} Dec 09 03:27:37 crc kubenswrapper[4766]: I1209 03:27:37.316445 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:27:37 crc kubenswrapper[4766]: I1209 03:27:37.316771 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:27:38 crc kubenswrapper[4766]: I1209 03:27:38.069023 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b52zs" event={"ID":"5c004078-5148-40dc-9b15-a134ea8fad13","Type":"ContainerStarted","Data":"a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879"} Dec 09 03:27:38 crc kubenswrapper[4766]: I1209 03:27:38.097689 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b52zs" podStartSLOduration=2.69721491 podStartE2EDuration="4.097671168s" podCreationTimestamp="2025-12-09 03:27:34 +0000 UTC" firstStartedPulling="2025-12-09 03:27:36.054145758 +0000 UTC m=+937.763451224" lastFinishedPulling="2025-12-09 03:27:37.454602056 +0000 UTC m=+939.163907482" observedRunningTime="2025-12-09 03:27:38.094313527 +0000 UTC m=+939.803618993" watchObservedRunningTime="2025-12-09 03:27:38.097671168 +0000 UTC m=+939.806976594" Dec 09 03:27:39 crc 
kubenswrapper[4766]: I1209 03:27:39.839111 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:39 crc kubenswrapper[4766]: I1209 03:27:39.839951 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" Dec 09 03:27:40 crc kubenswrapper[4766]: I1209 03:27:40.240369 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv"] Dec 09 03:27:40 crc kubenswrapper[4766]: W1209 03:27:40.248897 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c9cee8d_ebaf_416e_8dde_e9bfddf9775a.slice/crio-8342a9c18f5855a4e49702069f1b0934771f80bd86bc2930bf427b6180f73585 WatchSource:0}: Error finding container 8342a9c18f5855a4e49702069f1b0934771f80bd86bc2930bf427b6180f73585: Status 404 returned error can't find the container with id 8342a9c18f5855a4e49702069f1b0934771f80bd86bc2930bf427b6180f73585 Dec 09 03:27:40 crc kubenswrapper[4766]: I1209 03:27:40.838910 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:40 crc kubenswrapper[4766]: I1209 03:27:40.839876 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:41 crc kubenswrapper[4766]: I1209 03:27:41.089546 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" event={"ID":"2c9cee8d-ebaf-416e-8dde-e9bfddf9775a","Type":"ContainerStarted","Data":"8342a9c18f5855a4e49702069f1b0934771f80bd86bc2930bf427b6180f73585"} Dec 09 03:27:41 crc kubenswrapper[4766]: I1209 03:27:41.139281 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pml89"] Dec 09 03:27:41 crc kubenswrapper[4766]: W1209 03:27:41.144042 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcfb187_78b1_4f21_bbab_edff562f4033.slice/crio-fcd2aaac9d4d51ce2526c11f38a7842540d238976ee88834f90ab6e60352f67a WatchSource:0}: Error finding container fcd2aaac9d4d51ce2526c11f38a7842540d238976ee88834f90ab6e60352f67a: Status 404 returned error can't find the container with id fcd2aaac9d4d51ce2526c11f38a7842540d238976ee88834f90ab6e60352f67a Dec 09 03:27:42 crc kubenswrapper[4766]: I1209 03:27:42.105145 4766 generic.go:334] "Generic (PLEG): container finished" podID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerID="d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b" exitCode=0 Dec 09 03:27:42 crc kubenswrapper[4766]: I1209 03:27:42.105680 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pml89" event={"ID":"3dcfb187-78b1-4f21-bbab-edff562f4033","Type":"ContainerDied","Data":"d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b"} Dec 09 03:27:42 crc kubenswrapper[4766]: I1209 03:27:42.106023 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pml89" 
event={"ID":"3dcfb187-78b1-4f21-bbab-edff562f4033","Type":"ContainerStarted","Data":"fcd2aaac9d4d51ce2526c11f38a7842540d238976ee88834f90ab6e60352f67a"} Dec 09 03:27:42 crc kubenswrapper[4766]: I1209 03:27:42.112179 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" event={"ID":"2c9cee8d-ebaf-416e-8dde-e9bfddf9775a","Type":"ContainerStarted","Data":"6d260fca8c648ec7794ebccbc17bf2a991124bfef9d4b1284264db84582a2f75"} Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.119793 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pml89" event={"ID":"3dcfb187-78b1-4f21-bbab-edff562f4033","Type":"ContainerStarted","Data":"330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2"} Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.142520 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dcnxv" podStartSLOduration=18.589995793 podStartE2EDuration="20.142501356s" podCreationTimestamp="2025-12-09 03:27:23 +0000 UTC" firstStartedPulling="2025-12-09 03:27:40.252004526 +0000 UTC m=+941.961309952" lastFinishedPulling="2025-12-09 03:27:41.804510089 +0000 UTC m=+943.513815515" observedRunningTime="2025-12-09 03:27:42.1546904 +0000 UTC m=+943.863995856" watchObservedRunningTime="2025-12-09 03:27:43.142501356 +0000 UTC m=+944.851806792" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.163406 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.164189 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.169059 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-f4jbm" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.174904 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.176160 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.184974 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.235398 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.248291 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.268845 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-j2cd4"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.269643 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.309348 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.309992 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.312023 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.312246 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-7fb6v" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.312446 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.323796 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.334175 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c191369f-2896-4aa5-9a85-19fe4ba4ba6d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-wt8wd\" (UID: \"c191369f-2896-4aa5-9a85-19fe4ba4ba6d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.334736 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbww\" (UniqueName: \"kubernetes.io/projected/c191369f-2896-4aa5-9a85-19fe4ba4ba6d-kube-api-access-tgbww\") pod \"nmstate-webhook-5f6d4c5ccb-wt8wd\" (UID: \"c191369f-2896-4aa5-9a85-19fe4ba4ba6d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.334796 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9sts\" (UniqueName: \"kubernetes.io/projected/f1348766-b8a2-4754-88b9-078bf2081465-kube-api-access-t9sts\") pod \"nmstate-metrics-7f946cbc9-m9cmd\" (UID: 
\"f1348766-b8a2-4754-88b9-078bf2081465\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.435822 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-ovs-socket\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.435880 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbww\" (UniqueName: \"kubernetes.io/projected/c191369f-2896-4aa5-9a85-19fe4ba4ba6d-kube-api-access-tgbww\") pod \"nmstate-webhook-5f6d4c5ccb-wt8wd\" (UID: \"c191369f-2896-4aa5-9a85-19fe4ba4ba6d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.435914 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-nmstate-lock\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.435940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e456122d-6335-472c-a630-18cbf88d4352-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.436155 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-dbus-socket\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.436198 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9sts\" (UniqueName: \"kubernetes.io/projected/f1348766-b8a2-4754-88b9-078bf2081465-kube-api-access-t9sts\") pod \"nmstate-metrics-7f946cbc9-m9cmd\" (UID: \"f1348766-b8a2-4754-88b9-078bf2081465\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.436263 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e456122d-6335-472c-a630-18cbf88d4352-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.436313 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c191369f-2896-4aa5-9a85-19fe4ba4ba6d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-wt8wd\" (UID: \"c191369f-2896-4aa5-9a85-19fe4ba4ba6d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.436336 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qbb\" (UniqueName: \"kubernetes.io/projected/9cba229b-a520-4e88-bf57-d2418eedc7bd-kube-api-access-g8qbb\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.436365 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgbb\" (UniqueName: \"kubernetes.io/projected/e456122d-6335-472c-a630-18cbf88d4352-kube-api-access-lmgbb\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.443012 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c191369f-2896-4aa5-9a85-19fe4ba4ba6d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-wt8wd\" (UID: \"c191369f-2896-4aa5-9a85-19fe4ba4ba6d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.453542 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbww\" (UniqueName: \"kubernetes.io/projected/c191369f-2896-4aa5-9a85-19fe4ba4ba6d-kube-api-access-tgbww\") pod \"nmstate-webhook-5f6d4c5ccb-wt8wd\" (UID: \"c191369f-2896-4aa5-9a85-19fe4ba4ba6d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.459512 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9sts\" (UniqueName: \"kubernetes.io/projected/f1348766-b8a2-4754-88b9-078bf2081465-kube-api-access-t9sts\") pod \"nmstate-metrics-7f946cbc9-m9cmd\" (UID: \"f1348766-b8a2-4754-88b9-078bf2081465\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.468081 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-bbcc9b596-pnrnf"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.468773 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.536928 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bbcc9b596-pnrnf"] Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.537955 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-service-ca\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538020 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e456122d-6335-472c-a630-18cbf88d4352-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538072 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/83662fc7-8654-481f-96cd-17e254e12e1f-console-oauth-config\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538117 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4gwx\" (UniqueName: \"kubernetes.io/projected/83662fc7-8654-481f-96cd-17e254e12e1f-kube-api-access-w4gwx\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538144 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-console-config\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538174 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qbb\" (UniqueName: \"kubernetes.io/projected/9cba229b-a520-4e88-bf57-d2418eedc7bd-kube-api-access-g8qbb\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/83662fc7-8654-481f-96cd-17e254e12e1f-console-serving-cert\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: E1209 03:27:43.538255 4766 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 09 03:27:43 crc kubenswrapper[4766]: E1209 03:27:43.538358 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e456122d-6335-472c-a630-18cbf88d4352-plugin-serving-cert podName:e456122d-6335-472c-a630-18cbf88d4352 nodeName:}" failed. No retries permitted until 2025-12-09 03:27:44.038332725 +0000 UTC m=+945.747638251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e456122d-6335-472c-a630-18cbf88d4352-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-sfpqb" (UID: "e456122d-6335-472c-a630-18cbf88d4352") : secret "plugin-serving-cert" not found Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538391 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgbb\" (UniqueName: \"kubernetes.io/projected/e456122d-6335-472c-a630-18cbf88d4352-kube-api-access-lmgbb\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538422 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-ovs-socket\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538469 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-nmstate-lock\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538500 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-oauth-serving-cert\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538525 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e456122d-6335-472c-a630-18cbf88d4352-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538551 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-nmstate-lock\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538600 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-ovs-socket\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538560 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-dbus-socket\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538644 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-trusted-ca-bundle\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.538836 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" 
(UniqueName: \"kubernetes.io/host-path/9cba229b-a520-4e88-bf57-d2418eedc7bd-dbus-socket\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.539540 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e456122d-6335-472c-a630-18cbf88d4352-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.542539 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.555377 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.559680 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgbb\" (UniqueName: \"kubernetes.io/projected/e456122d-6335-472c-a630-18cbf88d4352-kube-api-access-lmgbb\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.563087 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qbb\" (UniqueName: \"kubernetes.io/projected/9cba229b-a520-4e88-bf57-d2418eedc7bd-kube-api-access-g8qbb\") pod \"nmstate-handler-j2cd4\" (UID: \"9cba229b-a520-4e88-bf57-d2418eedc7bd\") " pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.627694 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.640270 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-trusted-ca-bundle\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.640325 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-service-ca\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.640370 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/83662fc7-8654-481f-96cd-17e254e12e1f-console-oauth-config\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.640394 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4gwx\" (UniqueName: \"kubernetes.io/projected/83662fc7-8654-481f-96cd-17e254e12e1f-kube-api-access-w4gwx\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.640414 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-console-config\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " 
pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.640434 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/83662fc7-8654-481f-96cd-17e254e12e1f-console-serving-cert\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.640468 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-oauth-serving-cert\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.641250 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-oauth-serving-cert\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.642103 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-trusted-ca-bundle\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.642672 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-service-ca\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 
crc kubenswrapper[4766]: I1209 03:27:43.644265 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/83662fc7-8654-481f-96cd-17e254e12e1f-console-config\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.646612 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/83662fc7-8654-481f-96cd-17e254e12e1f-console-serving-cert\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.647133 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/83662fc7-8654-481f-96cd-17e254e12e1f-console-oauth-config\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.662921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4gwx\" (UniqueName: \"kubernetes.io/projected/83662fc7-8654-481f-96cd-17e254e12e1f-kube-api-access-w4gwx\") pod \"console-bbcc9b596-pnrnf\" (UID: \"83662fc7-8654-481f-96cd-17e254e12e1f\") " pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.751544 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd"] Dec 09 03:27:43 crc kubenswrapper[4766]: W1209 03:27:43.760079 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc191369f_2896_4aa5_9a85_19fe4ba4ba6d.slice/crio-94e7bfe673ceaa6ce29d2eef79f52709bb9e21e922d84b0a2643efcc68bebe71 WatchSource:0}: Error finding container 94e7bfe673ceaa6ce29d2eef79f52709bb9e21e922d84b0a2643efcc68bebe71: Status 404 returned error can't find the container with id 94e7bfe673ceaa6ce29d2eef79f52709bb9e21e922d84b0a2643efcc68bebe71 Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.792954 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.809882 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd"] Dec 09 03:27:43 crc kubenswrapper[4766]: W1209 03:27:43.818002 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1348766_b8a2_4754_88b9_078bf2081465.slice/crio-06252218cf966f6645254dcd16fa599dbeedac1cb2b5c621969096fc7f4a3de4 WatchSource:0}: Error finding container 06252218cf966f6645254dcd16fa599dbeedac1cb2b5c621969096fc7f4a3de4: Status 404 returned error can't find the container with id 06252218cf966f6645254dcd16fa599dbeedac1cb2b5c621969096fc7f4a3de4 Dec 09 03:27:43 crc kubenswrapper[4766]: I1209 03:27:43.972582 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bbcc9b596-pnrnf"] Dec 09 03:27:43 crc kubenswrapper[4766]: W1209 03:27:43.978161 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83662fc7_8654_481f_96cd_17e254e12e1f.slice/crio-e7fbc5423ab3ad74875cd1e5a550e54490c327cbacef8c621f8292499af41de7 WatchSource:0}: Error finding container e7fbc5423ab3ad74875cd1e5a550e54490c327cbacef8c621f8292499af41de7: Status 404 returned error can't find the container with id 
e7fbc5423ab3ad74875cd1e5a550e54490c327cbacef8c621f8292499af41de7 Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.043509 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e456122d-6335-472c-a630-18cbf88d4352-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.048156 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e456122d-6335-472c-a630-18cbf88d4352-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-sfpqb\" (UID: \"e456122d-6335-472c-a630-18cbf88d4352\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.125201 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-j2cd4" event={"ID":"9cba229b-a520-4e88-bf57-d2418eedc7bd","Type":"ContainerStarted","Data":"3fbc6a4aaba3e9886db5997b9bc24294f6fac2e499884c406e4cc2de38fd79a9"} Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.126412 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" event={"ID":"f1348766-b8a2-4754-88b9-078bf2081465","Type":"ContainerStarted","Data":"06252218cf966f6645254dcd16fa599dbeedac1cb2b5c621969096fc7f4a3de4"} Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.127717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bbcc9b596-pnrnf" event={"ID":"83662fc7-8654-481f-96cd-17e254e12e1f","Type":"ContainerStarted","Data":"aa5ca87325053f317aae44fefeed947210de95084bdda8c81677e84945499272"} Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.127753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-bbcc9b596-pnrnf" event={"ID":"83662fc7-8654-481f-96cd-17e254e12e1f","Type":"ContainerStarted","Data":"e7fbc5423ab3ad74875cd1e5a550e54490c327cbacef8c621f8292499af41de7"} Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.130123 4766 generic.go:334] "Generic (PLEG): container finished" podID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerID="330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2" exitCode=0 Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.130175 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pml89" event={"ID":"3dcfb187-78b1-4f21-bbab-edff562f4033","Type":"ContainerDied","Data":"330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2"} Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.131576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" event={"ID":"c191369f-2896-4aa5-9a85-19fe4ba4ba6d","Type":"ContainerStarted","Data":"94e7bfe673ceaa6ce29d2eef79f52709bb9e21e922d84b0a2643efcc68bebe71"} Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.153912 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bbcc9b596-pnrnf" podStartSLOduration=1.153886087 podStartE2EDuration="1.153886087s" podCreationTimestamp="2025-12-09 03:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:27:44.148585085 +0000 UTC m=+945.857890521" watchObservedRunningTime="2025-12-09 03:27:44.153886087 +0000 UTC m=+945.863191513" Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.236925 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.696259 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb"] Dec 09 03:27:44 crc kubenswrapper[4766]: W1209 03:27:44.708224 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode456122d_6335_472c_a630_18cbf88d4352.slice/crio-751c303d4ea457759700df62be133574c6f89e004b462b00fcd17ac8baae5f5b WatchSource:0}: Error finding container 751c303d4ea457759700df62be133574c6f89e004b462b00fcd17ac8baae5f5b: Status 404 returned error can't find the container with id 751c303d4ea457759700df62be133574c6f89e004b462b00fcd17ac8baae5f5b Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.808893 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.808941 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:44 crc kubenswrapper[4766]: I1209 03:27:44.859514 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:45 crc kubenswrapper[4766]: I1209 03:27:45.139404 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pml89" event={"ID":"3dcfb187-78b1-4f21-bbab-edff562f4033","Type":"ContainerStarted","Data":"3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d"} Dec 09 03:27:45 crc kubenswrapper[4766]: I1209 03:27:45.142238 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" 
event={"ID":"e456122d-6335-472c-a630-18cbf88d4352","Type":"ContainerStarted","Data":"751c303d4ea457759700df62be133574c6f89e004b462b00fcd17ac8baae5f5b"} Dec 09 03:27:45 crc kubenswrapper[4766]: I1209 03:27:45.160147 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pml89" podStartSLOduration=15.752740972 podStartE2EDuration="18.16011281s" podCreationTimestamp="2025-12-09 03:27:27 +0000 UTC" firstStartedPulling="2025-12-09 03:27:42.108160549 +0000 UTC m=+943.817465985" lastFinishedPulling="2025-12-09 03:27:44.515532397 +0000 UTC m=+946.224837823" observedRunningTime="2025-12-09 03:27:45.158189668 +0000 UTC m=+946.867495094" watchObservedRunningTime="2025-12-09 03:27:45.16011281 +0000 UTC m=+946.869418236" Dec 09 03:27:45 crc kubenswrapper[4766]: I1209 03:27:45.189923 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:46 crc kubenswrapper[4766]: I1209 03:27:46.828287 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b52zs"] Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.158784 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" event={"ID":"c191369f-2896-4aa5-9a85-19fe4ba4ba6d","Type":"ContainerStarted","Data":"cadaa7b848936a7910782a1e64fae95831747dfc5c5e3b4318abd12135e8d49e"} Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.159071 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.161237 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-j2cd4" event={"ID":"9cba229b-a520-4e88-bf57-d2418eedc7bd","Type":"ContainerStarted","Data":"13a2aed2014cfef3ea7a3a4e31f1f90463b735e80c4e7c52453eada7145b7ed4"} Dec 09 03:27:47 crc 
kubenswrapper[4766]: I1209 03:27:47.161388 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.163413 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" event={"ID":"f1348766-b8a2-4754-88b9-078bf2081465","Type":"ContainerStarted","Data":"d04138eed32b5c633627ee1989e53f9f058c3403352cecbd8adc078c5ecbe0c9"} Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.163494 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b52zs" podUID="5c004078-5148-40dc-9b15-a134ea8fad13" containerName="registry-server" containerID="cri-o://a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879" gracePeriod=2 Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.196072 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" podStartSLOduration=1.815490828 podStartE2EDuration="4.196050394s" podCreationTimestamp="2025-12-09 03:27:43 +0000 UTC" firstStartedPulling="2025-12-09 03:27:43.761982666 +0000 UTC m=+945.471288112" lastFinishedPulling="2025-12-09 03:27:46.142542242 +0000 UTC m=+947.851847678" observedRunningTime="2025-12-09 03:27:47.17577211 +0000 UTC m=+948.885077536" watchObservedRunningTime="2025-12-09 03:27:47.196050394 +0000 UTC m=+948.905355820" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.202471 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-j2cd4" podStartSLOduration=1.7956911359999999 podStartE2EDuration="4.202456737s" podCreationTimestamp="2025-12-09 03:27:43 +0000 UTC" firstStartedPulling="2025-12-09 03:27:43.660419145 +0000 UTC m=+945.369724571" lastFinishedPulling="2025-12-09 03:27:46.067184746 +0000 UTC m=+947.776490172" observedRunningTime="2025-12-09 
03:27:47.196412505 +0000 UTC m=+948.905717941" watchObservedRunningTime="2025-12-09 03:27:47.202456737 +0000 UTC m=+948.911762153" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.501045 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.688794 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-catalog-content\") pod \"5c004078-5148-40dc-9b15-a134ea8fad13\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.688897 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfbsj\" (UniqueName: \"kubernetes.io/projected/5c004078-5148-40dc-9b15-a134ea8fad13-kube-api-access-pfbsj\") pod \"5c004078-5148-40dc-9b15-a134ea8fad13\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.688991 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-utilities\") pod \"5c004078-5148-40dc-9b15-a134ea8fad13\" (UID: \"5c004078-5148-40dc-9b15-a134ea8fad13\") " Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.690020 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-utilities" (OuterVolumeSpecName: "utilities") pod "5c004078-5148-40dc-9b15-a134ea8fad13" (UID: "5c004078-5148-40dc-9b15-a134ea8fad13"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.694941 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c004078-5148-40dc-9b15-a134ea8fad13-kube-api-access-pfbsj" (OuterVolumeSpecName: "kube-api-access-pfbsj") pod "5c004078-5148-40dc-9b15-a134ea8fad13" (UID: "5c004078-5148-40dc-9b15-a134ea8fad13"). InnerVolumeSpecName "kube-api-access-pfbsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.737311 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c004078-5148-40dc-9b15-a134ea8fad13" (UID: "5c004078-5148-40dc-9b15-a134ea8fad13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.790232 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.790261 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfbsj\" (UniqueName: \"kubernetes.io/projected/5c004078-5148-40dc-9b15-a134ea8fad13-kube-api-access-pfbsj\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.790270 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c004078-5148-40dc-9b15-a134ea8fad13-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:27:47 crc kubenswrapper[4766]: I1209 03:27:47.996753 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:47 crc 
kubenswrapper[4766]: I1209 03:27:47.996812 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.066160 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.173746 4766 generic.go:334] "Generic (PLEG): container finished" podID="5c004078-5148-40dc-9b15-a134ea8fad13" containerID="a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879" exitCode=0 Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.173845 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b52zs" event={"ID":"5c004078-5148-40dc-9b15-a134ea8fad13","Type":"ContainerDied","Data":"a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879"} Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.173870 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b52zs" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.173879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b52zs" event={"ID":"5c004078-5148-40dc-9b15-a134ea8fad13","Type":"ContainerDied","Data":"c9d0697a72a69d200e6cbf5488d4d456f6555d6a322175dc37978b684fdeed02"} Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.173893 4766 scope.go:117] "RemoveContainer" containerID="a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.175971 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" event={"ID":"e456122d-6335-472c-a630-18cbf88d4352","Type":"ContainerStarted","Data":"e8e9c28c9eb42df5ff1400879c28d9c3cfb72b6422ed43d490f0c89d1b175bf5"} Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.196904 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-sfpqb" podStartSLOduration=2.8465261870000003 podStartE2EDuration="5.196884932s" podCreationTimestamp="2025-12-09 03:27:43 +0000 UTC" firstStartedPulling="2025-12-09 03:27:44.719025785 +0000 UTC m=+946.428331211" lastFinishedPulling="2025-12-09 03:27:47.06938453 +0000 UTC m=+948.778689956" observedRunningTime="2025-12-09 03:27:48.195377711 +0000 UTC m=+949.904683147" watchObservedRunningTime="2025-12-09 03:27:48.196884932 +0000 UTC m=+949.906190378" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.227975 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b52zs"] Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.231515 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b52zs"] Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.622048 4766 scope.go:117] 
"RemoveContainer" containerID="0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.672117 4766 scope.go:117] "RemoveContainer" containerID="18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.685956 4766 scope.go:117] "RemoveContainer" containerID="a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879" Dec 09 03:27:48 crc kubenswrapper[4766]: E1209 03:27:48.686398 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879\": container with ID starting with a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879 not found: ID does not exist" containerID="a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.686445 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879"} err="failed to get container status \"a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879\": rpc error: code = NotFound desc = could not find container \"a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879\": container with ID starting with a51d1319508bc1afd2b3f652b8714c6a0587eddd4889e0ef6f6702b41df65879 not found: ID does not exist" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.686468 4766 scope.go:117] "RemoveContainer" containerID="0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42" Dec 09 03:27:48 crc kubenswrapper[4766]: E1209 03:27:48.686957 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42\": container with ID starting with 
0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42 not found: ID does not exist" containerID="0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.686978 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42"} err="failed to get container status \"0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42\": rpc error: code = NotFound desc = could not find container \"0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42\": container with ID starting with 0e13aea45c96f61573842ead66b61315449fc21ce6c06e76038d378f44f8bf42 not found: ID does not exist" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.686991 4766 scope.go:117] "RemoveContainer" containerID="18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32" Dec 09 03:27:48 crc kubenswrapper[4766]: E1209 03:27:48.687318 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32\": container with ID starting with 18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32 not found: ID does not exist" containerID="18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.687336 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32"} err="failed to get container status \"18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32\": rpc error: code = NotFound desc = could not find container \"18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32\": container with ID starting with 18699ecac63e76319ad416d1f47effc570128138ccf5897896789fed03b9bd32 not found: ID does not 
exist" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.848276 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c004078-5148-40dc-9b15-a134ea8fad13" path="/var/lib/kubelet/pods/5c004078-5148-40dc-9b15-a134ea8fad13/volumes" Dec 09 03:27:48 crc kubenswrapper[4766]: I1209 03:27:48.877387 4766 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podd2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podd2c7a4fd-d2ea-4d01-85c3-beb2be74d9ec] : Timed out while waiting for systemd to remove kubepods-burstable-podd2c7a4fd_d2ea_4d01_85c3_beb2be74d9ec.slice" Dec 09 03:27:49 crc kubenswrapper[4766]: I1209 03:27:49.153077 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mjnb6" Dec 09 03:27:49 crc kubenswrapper[4766]: I1209 03:27:49.209695 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" event={"ID":"f1348766-b8a2-4754-88b9-078bf2081465","Type":"ContainerStarted","Data":"2b3d7363822b021c90344c1b36246173d73115f9b578edc966442b9060ce61cc"} Dec 09 03:27:49 crc kubenswrapper[4766]: I1209 03:27:49.243123 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m9cmd" podStartSLOduration=1.374448555 podStartE2EDuration="6.243107109s" podCreationTimestamp="2025-12-09 03:27:43 +0000 UTC" firstStartedPulling="2025-12-09 03:27:43.819953303 +0000 UTC m=+945.529258729" lastFinishedPulling="2025-12-09 03:27:48.688611857 +0000 UTC m=+950.397917283" observedRunningTime="2025-12-09 03:27:49.241022193 +0000 UTC m=+950.950327649" watchObservedRunningTime="2025-12-09 03:27:49.243107109 +0000 UTC m=+950.952412535" Dec 09 03:27:53 crc kubenswrapper[4766]: I1209 03:27:53.668357 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-nmstate/nmstate-handler-j2cd4" Dec 09 03:27:53 crc kubenswrapper[4766]: I1209 03:27:53.794076 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:53 crc kubenswrapper[4766]: I1209 03:27:53.794130 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:53 crc kubenswrapper[4766]: I1209 03:27:53.800185 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:54 crc kubenswrapper[4766]: I1209 03:27:54.248602 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bbcc9b596-pnrnf" Dec 09 03:27:54 crc kubenswrapper[4766]: I1209 03:27:54.303150 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ss24g"] Dec 09 03:27:58 crc kubenswrapper[4766]: I1209 03:27:58.051311 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:58 crc kubenswrapper[4766]: I1209 03:27:58.095457 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pml89"] Dec 09 03:27:58 crc kubenswrapper[4766]: I1209 03:27:58.274146 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pml89" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerName="registry-server" containerID="cri-o://3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d" gracePeriod=2 Dec 09 03:27:59 crc kubenswrapper[4766]: I1209 03:27:59.328979 4766 scope.go:117] "RemoveContainer" containerID="1274a16117cd4f128e904eee38ce30b0880c07c56e0af593398c05bb0ec319f4" Dec 09 03:27:59 crc kubenswrapper[4766]: I1209 03:27:59.892713 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:27:59 crc kubenswrapper[4766]: I1209 03:27:59.986028 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-utilities\") pod \"3dcfb187-78b1-4f21-bbab-edff562f4033\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " Dec 09 03:27:59 crc kubenswrapper[4766]: I1209 03:27:59.986112 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44rqq\" (UniqueName: \"kubernetes.io/projected/3dcfb187-78b1-4f21-bbab-edff562f4033-kube-api-access-44rqq\") pod \"3dcfb187-78b1-4f21-bbab-edff562f4033\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " Dec 09 03:27:59 crc kubenswrapper[4766]: I1209 03:27:59.986164 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-catalog-content\") pod \"3dcfb187-78b1-4f21-bbab-edff562f4033\" (UID: \"3dcfb187-78b1-4f21-bbab-edff562f4033\") " Dec 09 03:27:59 crc kubenswrapper[4766]: I1209 03:27:59.987444 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-utilities" (OuterVolumeSpecName: "utilities") pod "3dcfb187-78b1-4f21-bbab-edff562f4033" (UID: "3dcfb187-78b1-4f21-bbab-edff562f4033"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:27:59 crc kubenswrapper[4766]: I1209 03:27:59.995375 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dcfb187-78b1-4f21-bbab-edff562f4033-kube-api-access-44rqq" (OuterVolumeSpecName: "kube-api-access-44rqq") pod "3dcfb187-78b1-4f21-bbab-edff562f4033" (UID: "3dcfb187-78b1-4f21-bbab-edff562f4033"). InnerVolumeSpecName "kube-api-access-44rqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.053405 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3dcfb187-78b1-4f21-bbab-edff562f4033" (UID: "3dcfb187-78b1-4f21-bbab-edff562f4033"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.092344 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.092578 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44rqq\" (UniqueName: \"kubernetes.io/projected/3dcfb187-78b1-4f21-bbab-edff562f4033-kube-api-access-44rqq\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.092668 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dcfb187-78b1-4f21-bbab-edff562f4033-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.288051 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gx9l2_c83a9d31-9c87-4a13-ab9a-2992e852eb47/kube-multus/2.log" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.293945 4766 generic.go:334] "Generic (PLEG): container finished" podID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerID="3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d" exitCode=0 Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.294017 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pml89" 
event={"ID":"3dcfb187-78b1-4f21-bbab-edff562f4033","Type":"ContainerDied","Data":"3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d"} Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.294060 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pml89" event={"ID":"3dcfb187-78b1-4f21-bbab-edff562f4033","Type":"ContainerDied","Data":"fcd2aaac9d4d51ce2526c11f38a7842540d238976ee88834f90ab6e60352f67a"} Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.294093 4766 scope.go:117] "RemoveContainer" containerID="3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.294281 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pml89" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.318116 4766 scope.go:117] "RemoveContainer" containerID="330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.338295 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pml89"] Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.342512 4766 scope.go:117] "RemoveContainer" containerID="d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.345904 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pml89"] Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.390369 4766 scope.go:117] "RemoveContainer" containerID="3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d" Dec 09 03:28:00 crc kubenswrapper[4766]: E1209 03:28:00.390969 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d\": container 
with ID starting with 3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d not found: ID does not exist" containerID="3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.391144 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d"} err="failed to get container status \"3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d\": rpc error: code = NotFound desc = could not find container \"3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d\": container with ID starting with 3de8923e484957307f8e1f2fac6a6e180f2496dafcb493781e9dd2044f32c94d not found: ID does not exist" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.391256 4766 scope.go:117] "RemoveContainer" containerID="330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2" Dec 09 03:28:00 crc kubenswrapper[4766]: E1209 03:28:00.392505 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2\": container with ID starting with 330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2 not found: ID does not exist" containerID="330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.392629 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2"} err="failed to get container status \"330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2\": rpc error: code = NotFound desc = could not find container \"330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2\": container with ID starting with 330d913c030f2aa27c8f1636d57588ddb776311a2ccea229c60c10f1ffbae2c2 not 
found: ID does not exist" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.392717 4766 scope.go:117] "RemoveContainer" containerID="d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b" Dec 09 03:28:00 crc kubenswrapper[4766]: E1209 03:28:00.393658 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b\": container with ID starting with d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b not found: ID does not exist" containerID="d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.393690 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b"} err="failed to get container status \"d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b\": rpc error: code = NotFound desc = could not find container \"d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b\": container with ID starting with d8692dc2515a341424020b5dc60b063681315e0bd612d9002006aa0daee0bb8b not found: ID does not exist" Dec 09 03:28:00 crc kubenswrapper[4766]: I1209 03:28:00.848306 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" path="/var/lib/kubelet/pods/3dcfb187-78b1-4f21-bbab-edff562f4033/volumes" Dec 09 03:28:03 crc kubenswrapper[4766]: I1209 03:28:03.565040 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-wt8wd" Dec 09 03:28:07 crc kubenswrapper[4766]: I1209 03:28:07.316322 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:28:07 crc kubenswrapper[4766]: I1209 03:28:07.316585 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:28:07 crc kubenswrapper[4766]: I1209 03:28:07.316635 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:28:07 crc kubenswrapper[4766]: I1209 03:28:07.317080 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f05dd63b19e38608a1f279fa1bc992ee750c24d1fb42681cf4619e9e86acf2d"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:28:07 crc kubenswrapper[4766]: I1209 03:28:07.317132 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://2f05dd63b19e38608a1f279fa1bc992ee750c24d1fb42681cf4619e9e86acf2d" gracePeriod=600 Dec 09 03:28:08 crc kubenswrapper[4766]: I1209 03:28:08.350372 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="2f05dd63b19e38608a1f279fa1bc992ee750c24d1fb42681cf4619e9e86acf2d" exitCode=0 Dec 09 03:28:08 crc kubenswrapper[4766]: I1209 03:28:08.350429 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"2f05dd63b19e38608a1f279fa1bc992ee750c24d1fb42681cf4619e9e86acf2d"} Dec 09 03:28:08 crc kubenswrapper[4766]: I1209 03:28:08.350789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"6a0a8a7bff8971534685d42f33b8fda759b536f0edd8ff1b38fb2ef750399fde"} Dec 09 03:28:08 crc kubenswrapper[4766]: I1209 03:28:08.350825 4766 scope.go:117] "RemoveContainer" containerID="d129c895ab3a94eff9c52289bd9c855b60f720ebf245b70e4c33b5f5b16f3734" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.219797 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc"] Dec 09 03:28:16 crc kubenswrapper[4766]: E1209 03:28:16.220496 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c004078-5148-40dc-9b15-a134ea8fad13" containerName="extract-content" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.220508 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c004078-5148-40dc-9b15-a134ea8fad13" containerName="extract-content" Dec 09 03:28:16 crc kubenswrapper[4766]: E1209 03:28:16.220517 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerName="extract-content" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.220523 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerName="extract-content" Dec 09 03:28:16 crc kubenswrapper[4766]: E1209 03:28:16.220537 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c004078-5148-40dc-9b15-a134ea8fad13" containerName="registry-server" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.220545 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5c004078-5148-40dc-9b15-a134ea8fad13" containerName="registry-server" Dec 09 03:28:16 crc kubenswrapper[4766]: E1209 03:28:16.220553 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerName="extract-utilities" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.220559 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerName="extract-utilities" Dec 09 03:28:16 crc kubenswrapper[4766]: E1209 03:28:16.220568 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerName="registry-server" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.220573 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerName="registry-server" Dec 09 03:28:16 crc kubenswrapper[4766]: E1209 03:28:16.220580 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c004078-5148-40dc-9b15-a134ea8fad13" containerName="extract-utilities" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.220585 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c004078-5148-40dc-9b15-a134ea8fad13" containerName="extract-utilities" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.220680 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c004078-5148-40dc-9b15-a134ea8fad13" containerName="registry-server" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.220697 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcfb187-78b1-4f21-bbab-edff562f4033" containerName="registry-server" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.221435 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.223795 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.228858 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc"] Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.314735 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.314811 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz28h\" (UniqueName: \"kubernetes.io/projected/57c50439-8b6a-49d7-8c83-04cf28b100f1-kube-api-access-zz28h\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.314845 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: 
I1209 03:28:16.416383 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.416490 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz28h\" (UniqueName: \"kubernetes.io/projected/57c50439-8b6a-49d7-8c83-04cf28b100f1-kube-api-access-zz28h\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.416548 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.417005 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.417076 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.439377 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz28h\" (UniqueName: \"kubernetes.io/projected/57c50439-8b6a-49d7-8c83-04cf28b100f1-kube-api-access-zz28h\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.537337 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:16 crc kubenswrapper[4766]: I1209 03:28:16.792282 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc"] Dec 09 03:28:16 crc kubenswrapper[4766]: W1209 03:28:16.804487 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57c50439_8b6a_49d7_8c83_04cf28b100f1.slice/crio-dd7f1a8d263163c74d0293032856e841081507562ea3581b21cc010507b9fb57 WatchSource:0}: Error finding container dd7f1a8d263163c74d0293032856e841081507562ea3581b21cc010507b9fb57: Status 404 returned error can't find the container with id dd7f1a8d263163c74d0293032856e841081507562ea3581b21cc010507b9fb57 Dec 09 03:28:17 crc kubenswrapper[4766]: I1209 03:28:17.413187 4766 generic.go:334] "Generic (PLEG): container finished" podID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerID="268816d8e4604cd8c62f9c62210040a657bc2fa3556309eaa4d5fbcf804d1929" 
exitCode=0 Dec 09 03:28:17 crc kubenswrapper[4766]: I1209 03:28:17.413265 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" event={"ID":"57c50439-8b6a-49d7-8c83-04cf28b100f1","Type":"ContainerDied","Data":"268816d8e4604cd8c62f9c62210040a657bc2fa3556309eaa4d5fbcf804d1929"} Dec 09 03:28:17 crc kubenswrapper[4766]: I1209 03:28:17.413752 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" event={"ID":"57c50439-8b6a-49d7-8c83-04cf28b100f1","Type":"ContainerStarted","Data":"dd7f1a8d263163c74d0293032856e841081507562ea3581b21cc010507b9fb57"} Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.346485 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ss24g" podUID="34120810-df87-4443-a2f7-16982e46027d" containerName="console" containerID="cri-o://3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8" gracePeriod=15 Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.432826 4766 generic.go:334] "Generic (PLEG): container finished" podID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerID="75bf3eadcfc2d9f71974916502f7a0eb4b4b58106ef403504d1ec797b2de8d68" exitCode=0 Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.432886 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" event={"ID":"57c50439-8b6a-49d7-8c83-04cf28b100f1","Type":"ContainerDied","Data":"75bf3eadcfc2d9f71974916502f7a0eb4b4b58106ef403504d1ec797b2de8d68"} Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.771801 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ss24g_34120810-df87-4443-a2f7-16982e46027d/console/0.log" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.772418 4766 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.864882 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf478\" (UniqueName: \"kubernetes.io/projected/34120810-df87-4443-a2f7-16982e46027d-kube-api-access-lf478\") pod \"34120810-df87-4443-a2f7-16982e46027d\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.865008 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-console-config\") pod \"34120810-df87-4443-a2f7-16982e46027d\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.865082 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-oauth-serving-cert\") pod \"34120810-df87-4443-a2f7-16982e46027d\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.865152 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-service-ca\") pod \"34120810-df87-4443-a2f7-16982e46027d\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.865235 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-serving-cert\") pod \"34120810-df87-4443-a2f7-16982e46027d\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.865411 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-oauth-config\") pod \"34120810-df87-4443-a2f7-16982e46027d\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.865483 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-trusted-ca-bundle\") pod \"34120810-df87-4443-a2f7-16982e46027d\" (UID: \"34120810-df87-4443-a2f7-16982e46027d\") " Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.866280 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "34120810-df87-4443-a2f7-16982e46027d" (UID: "34120810-df87-4443-a2f7-16982e46027d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.868436 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-service-ca" (OuterVolumeSpecName: "service-ca") pod "34120810-df87-4443-a2f7-16982e46027d" (UID: "34120810-df87-4443-a2f7-16982e46027d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.868527 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-console-config" (OuterVolumeSpecName: "console-config") pod "34120810-df87-4443-a2f7-16982e46027d" (UID: "34120810-df87-4443-a2f7-16982e46027d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.868797 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "34120810-df87-4443-a2f7-16982e46027d" (UID: "34120810-df87-4443-a2f7-16982e46027d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.871745 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "34120810-df87-4443-a2f7-16982e46027d" (UID: "34120810-df87-4443-a2f7-16982e46027d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.871928 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "34120810-df87-4443-a2f7-16982e46027d" (UID: "34120810-df87-4443-a2f7-16982e46027d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.872341 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34120810-df87-4443-a2f7-16982e46027d-kube-api-access-lf478" (OuterVolumeSpecName: "kube-api-access-lf478") pod "34120810-df87-4443-a2f7-16982e46027d" (UID: "34120810-df87-4443-a2f7-16982e46027d"). InnerVolumeSpecName "kube-api-access-lf478". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.967256 4766 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.967305 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.967315 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf478\" (UniqueName: \"kubernetes.io/projected/34120810-df87-4443-a2f7-16982e46027d-kube-api-access-lf478\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.967325 4766 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.967333 4766 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.967340 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34120810-df87-4443-a2f7-16982e46027d-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:19 crc kubenswrapper[4766]: I1209 03:28:19.967348 4766 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34120810-df87-4443-a2f7-16982e46027d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:20 crc 
kubenswrapper[4766]: I1209 03:28:20.441694 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ss24g_34120810-df87-4443-a2f7-16982e46027d/console/0.log" Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.443298 4766 generic.go:334] "Generic (PLEG): container finished" podID="34120810-df87-4443-a2f7-16982e46027d" containerID="3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8" exitCode=2 Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.443384 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ss24g" Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.443375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ss24g" event={"ID":"34120810-df87-4443-a2f7-16982e46027d","Type":"ContainerDied","Data":"3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8"} Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.443869 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ss24g" event={"ID":"34120810-df87-4443-a2f7-16982e46027d","Type":"ContainerDied","Data":"6dc7a56cf53232cb0fd90304c0e7e995dbc2f617f57ef780c5809ec194e2776a"} Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.443903 4766 scope.go:117] "RemoveContainer" containerID="3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8" Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.450469 4766 generic.go:334] "Generic (PLEG): container finished" podID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerID="9df6f74340dc669f4fb1ad0585d9448a08b7a2aefda9e847268c0484aab8db42" exitCode=0 Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.450515 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" 
event={"ID":"57c50439-8b6a-49d7-8c83-04cf28b100f1","Type":"ContainerDied","Data":"9df6f74340dc669f4fb1ad0585d9448a08b7a2aefda9e847268c0484aab8db42"} Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.476061 4766 scope.go:117] "RemoveContainer" containerID="3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8" Dec 09 03:28:20 crc kubenswrapper[4766]: E1209 03:28:20.476725 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8\": container with ID starting with 3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8 not found: ID does not exist" containerID="3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8" Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.476809 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8"} err="failed to get container status \"3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8\": rpc error: code = NotFound desc = could not find container \"3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8\": container with ID starting with 3a3d69b3d7357ac7b182c4935802487235e790ff5fcff207ecddb4fbf3c327a8 not found: ID does not exist" Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.510901 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ss24g"] Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.517487 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ss24g"] Dec 09 03:28:20 crc kubenswrapper[4766]: I1209 03:28:20.851832 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34120810-df87-4443-a2f7-16982e46027d" path="/var/lib/kubelet/pods/34120810-df87-4443-a2f7-16982e46027d/volumes" Dec 09 03:28:21 crc 
kubenswrapper[4766]: I1209 03:28:21.720295 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.794414 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-util\") pod \"57c50439-8b6a-49d7-8c83-04cf28b100f1\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.794555 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-bundle\") pod \"57c50439-8b6a-49d7-8c83-04cf28b100f1\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.794584 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz28h\" (UniqueName: \"kubernetes.io/projected/57c50439-8b6a-49d7-8c83-04cf28b100f1-kube-api-access-zz28h\") pod \"57c50439-8b6a-49d7-8c83-04cf28b100f1\" (UID: \"57c50439-8b6a-49d7-8c83-04cf28b100f1\") " Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.795421 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-bundle" (OuterVolumeSpecName: "bundle") pod "57c50439-8b6a-49d7-8c83-04cf28b100f1" (UID: "57c50439-8b6a-49d7-8c83-04cf28b100f1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.800019 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c50439-8b6a-49d7-8c83-04cf28b100f1-kube-api-access-zz28h" (OuterVolumeSpecName: "kube-api-access-zz28h") pod "57c50439-8b6a-49d7-8c83-04cf28b100f1" (UID: "57c50439-8b6a-49d7-8c83-04cf28b100f1"). InnerVolumeSpecName "kube-api-access-zz28h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.826435 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-util" (OuterVolumeSpecName: "util") pod "57c50439-8b6a-49d7-8c83-04cf28b100f1" (UID: "57c50439-8b6a-49d7-8c83-04cf28b100f1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.896928 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.896985 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz28h\" (UniqueName: \"kubernetes.io/projected/57c50439-8b6a-49d7-8c83-04cf28b100f1-kube-api-access-zz28h\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:21 crc kubenswrapper[4766]: I1209 03:28:21.897006 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57c50439-8b6a-49d7-8c83-04cf28b100f1-util\") on node \"crc\" DevicePath \"\"" Dec 09 03:28:22 crc kubenswrapper[4766]: I1209 03:28:22.471696 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" 
event={"ID":"57c50439-8b6a-49d7-8c83-04cf28b100f1","Type":"ContainerDied","Data":"dd7f1a8d263163c74d0293032856e841081507562ea3581b21cc010507b9fb57"} Dec 09 03:28:22 crc kubenswrapper[4766]: I1209 03:28:22.471753 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7f1a8d263163c74d0293032856e841081507562ea3581b21cc010507b9fb57" Dec 09 03:28:22 crc kubenswrapper[4766]: I1209 03:28:22.471828 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.096724 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz"] Dec 09 03:28:31 crc kubenswrapper[4766]: E1209 03:28:31.097493 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerName="util" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.097510 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerName="util" Dec 09 03:28:31 crc kubenswrapper[4766]: E1209 03:28:31.097523 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34120810-df87-4443-a2f7-16982e46027d" containerName="console" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.097530 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="34120810-df87-4443-a2f7-16982e46027d" containerName="console" Dec 09 03:28:31 crc kubenswrapper[4766]: E1209 03:28:31.097543 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerName="extract" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.097552 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerName="extract" Dec 09 03:28:31 crc kubenswrapper[4766]: E1209 03:28:31.097574 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerName="pull" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.097581 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerName="pull" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.097696 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="57c50439-8b6a-49d7-8c83-04cf28b100f1" containerName="extract" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.097708 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="34120810-df87-4443-a2f7-16982e46027d" containerName="console" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.098138 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.102231 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.102530 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.102801 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.102890 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.103017 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lzp2t" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.120837 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz"] Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.213652 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c75c2686-ab27-4d76-8164-72f00b640297-webhook-cert\") pod \"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.213713 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c75c2686-ab27-4d76-8164-72f00b640297-apiservice-cert\") pod \"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.213739 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sxsd\" (UniqueName: \"kubernetes.io/projected/c75c2686-ab27-4d76-8164-72f00b640297-kube-api-access-4sxsd\") pod \"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.315229 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c75c2686-ab27-4d76-8164-72f00b640297-apiservice-cert\") pod \"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.315282 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sxsd\" (UniqueName: \"kubernetes.io/projected/c75c2686-ab27-4d76-8164-72f00b640297-kube-api-access-4sxsd\") pod \"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.315367 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c75c2686-ab27-4d76-8164-72f00b640297-webhook-cert\") pod \"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.321539 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c75c2686-ab27-4d76-8164-72f00b640297-webhook-cert\") pod \"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.324233 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c75c2686-ab27-4d76-8164-72f00b640297-apiservice-cert\") pod \"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.335916 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sxsd\" (UniqueName: \"kubernetes.io/projected/c75c2686-ab27-4d76-8164-72f00b640297-kube-api-access-4sxsd\") pod 
\"metallb-operator-controller-manager-79d558dc88-8rvbz\" (UID: \"c75c2686-ab27-4d76-8164-72f00b640297\") " pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.412858 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz"] Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.413486 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.413809 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.415733 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.416614 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdvmq\" (UniqueName: \"kubernetes.io/projected/8a754727-b501-415f-9466-8dd5e4600864-kube-api-access-tdvmq\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.416684 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a754727-b501-415f-9466-8dd5e4600864-webhook-cert\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.416684 4766 reflector.go:368] Caches populated for 
*v1.Secret from object-"metallb-system"/"controller-dockercfg-pc8tz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.416761 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a754727-b501-415f-9466-8dd5e4600864-apiservice-cert\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.417697 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.432483 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz"] Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.517515 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdvmq\" (UniqueName: \"kubernetes.io/projected/8a754727-b501-415f-9466-8dd5e4600864-kube-api-access-tdvmq\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.517575 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a754727-b501-415f-9466-8dd5e4600864-webhook-cert\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.517606 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/8a754727-b501-415f-9466-8dd5e4600864-apiservice-cert\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.521036 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a754727-b501-415f-9466-8dd5e4600864-webhook-cert\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.538430 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdvmq\" (UniqueName: \"kubernetes.io/projected/8a754727-b501-415f-9466-8dd5e4600864-kube-api-access-tdvmq\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.539742 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a754727-b501-415f-9466-8dd5e4600864-apiservice-cert\") pod \"metallb-operator-webhook-server-66568d58f-2f5kz\" (UID: \"8a754727-b501-415f-9466-8dd5e4600864\") " pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.700879 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz"] Dec 09 03:28:31 crc kubenswrapper[4766]: W1209 03:28:31.707655 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75c2686_ab27_4d76_8164_72f00b640297.slice/crio-c8123a5c94daa308e650115960287ea79365240e856be16b117aa540e9b4af41 WatchSource:0}: Error finding container c8123a5c94daa308e650115960287ea79365240e856be16b117aa540e9b4af41: Status 404 returned error can't find the container with id c8123a5c94daa308e650115960287ea79365240e856be16b117aa540e9b4af41 Dec 09 03:28:31 crc kubenswrapper[4766]: I1209 03:28:31.784737 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:32 crc kubenswrapper[4766]: I1209 03:28:32.093158 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz"] Dec 09 03:28:32 crc kubenswrapper[4766]: W1209 03:28:32.101622 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a754727_b501_415f_9466_8dd5e4600864.slice/crio-8bf89ea37a3063110605936178f4f6488f920dee1c13bb12ef2e124ab8dcc4e4 WatchSource:0}: Error finding container 8bf89ea37a3063110605936178f4f6488f920dee1c13bb12ef2e124ab8dcc4e4: Status 404 returned error can't find the container with id 8bf89ea37a3063110605936178f4f6488f920dee1c13bb12ef2e124ab8dcc4e4 Dec 09 03:28:32 crc kubenswrapper[4766]: I1209 03:28:32.530560 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" event={"ID":"8a754727-b501-415f-9466-8dd5e4600864","Type":"ContainerStarted","Data":"8bf89ea37a3063110605936178f4f6488f920dee1c13bb12ef2e124ab8dcc4e4"} Dec 09 03:28:32 crc kubenswrapper[4766]: I1209 03:28:32.531680 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" 
event={"ID":"c75c2686-ab27-4d76-8164-72f00b640297","Type":"ContainerStarted","Data":"c8123a5c94daa308e650115960287ea79365240e856be16b117aa540e9b4af41"} Dec 09 03:28:37 crc kubenswrapper[4766]: I1209 03:28:37.567199 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" event={"ID":"8a754727-b501-415f-9466-8dd5e4600864","Type":"ContainerStarted","Data":"bef5f53162601c9bf2159c3613bfc8ed9866ae80de620da5f398817f87edeb0b"} Dec 09 03:28:37 crc kubenswrapper[4766]: I1209 03:28:37.567713 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:28:37 crc kubenswrapper[4766]: I1209 03:28:37.568314 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" event={"ID":"c75c2686-ab27-4d76-8164-72f00b640297","Type":"ContainerStarted","Data":"d053dbb3b00ab80872351d69f1a6953eff4fdd6a20a0beaec2c57bb9b2bfdab2"} Dec 09 03:28:37 crc kubenswrapper[4766]: I1209 03:28:37.568715 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:28:37 crc kubenswrapper[4766]: I1209 03:28:37.594943 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" podStartSLOduration=2.056585507 podStartE2EDuration="6.59492058s" podCreationTimestamp="2025-12-09 03:28:31 +0000 UTC" firstStartedPulling="2025-12-09 03:28:32.104106613 +0000 UTC m=+993.813412039" lastFinishedPulling="2025-12-09 03:28:36.642441686 +0000 UTC m=+998.351747112" observedRunningTime="2025-12-09 03:28:37.588992631 +0000 UTC m=+999.298298057" watchObservedRunningTime="2025-12-09 03:28:37.59492058 +0000 UTC m=+999.304226006" Dec 09 03:28:37 crc kubenswrapper[4766]: I1209 03:28:37.634746 4766 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" podStartSLOduration=1.722934924 podStartE2EDuration="6.63472275s" podCreationTimestamp="2025-12-09 03:28:31 +0000 UTC" firstStartedPulling="2025-12-09 03:28:31.712911523 +0000 UTC m=+993.422216949" lastFinishedPulling="2025-12-09 03:28:36.624699349 +0000 UTC m=+998.334004775" observedRunningTime="2025-12-09 03:28:37.617123547 +0000 UTC m=+999.326428983" watchObservedRunningTime="2025-12-09 03:28:37.63472275 +0000 UTC m=+999.344028176" Dec 09 03:28:51 crc kubenswrapper[4766]: I1209 03:28:51.792770 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66568d58f-2f5kz" Dec 09 03:29:11 crc kubenswrapper[4766]: I1209 03:29:11.416058 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-79d558dc88-8rvbz" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.305872 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-t7t9l"] Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.308750 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.309578 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m"] Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.310282 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.310747 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9gdhz" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.310944 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.318792 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m"] Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.319172 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.319422 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.386606 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ftjpq"] Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.387709 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.390167 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.390260 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.391737 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-w47ph" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.391857 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401597 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-metrics\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401645 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-sockets\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401665 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-startup\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401684 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70002-33af-46fd-9fab-162fb47d4b22-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r699m\" (UID: \"0ee70002-33af-46fd-9fab-162fb47d4b22\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401707 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-conf\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401741 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e61cbd09-5f69-4f62-b405-913a0fa68111-metrics-certs\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401762 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmd5\" (UniqueName: \"kubernetes.io/projected/e61cbd09-5f69-4f62-b405-913a0fa68111-kube-api-access-rrmd5\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401778 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-reloader\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.401793 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vzd6m\" (UniqueName: \"kubernetes.io/projected/0ee70002-33af-46fd-9fab-162fb47d4b22-kube-api-access-vzd6m\") pod \"frr-k8s-webhook-server-7fcb986d4-r699m\" (UID: \"0ee70002-33af-46fd-9fab-162fb47d4b22\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.431016 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-n58vw"] Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.431878 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.433471 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.452835 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-n58vw"] Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.502660 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.502707 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-cert\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.502739 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh68f\" (UniqueName: 
\"kubernetes.io/projected/cdf78232-a230-4c4e-a41e-fd13446f16c1-kube-api-access-sh68f\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.502769 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-metrics\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.502923 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-metrics-certs\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.502979 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cdf78232-a230-4c4e-a41e-fd13446f16c1-metallb-excludel2\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503051 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-sockets\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503101 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-startup\") pod \"frr-k8s-t7t9l\" (UID: 
\"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503123 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70002-33af-46fd-9fab-162fb47d4b22-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r699m\" (UID: \"0ee70002-33af-46fd-9fab-162fb47d4b22\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503158 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-metrics-certs\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503190 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz8cd\" (UniqueName: \"kubernetes.io/projected/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-kube-api-access-xz8cd\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503252 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-conf\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503343 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e61cbd09-5f69-4f62-b405-913a0fa68111-metrics-certs\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 
crc kubenswrapper[4766]: I1209 03:29:12.503396 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmd5\" (UniqueName: \"kubernetes.io/projected/e61cbd09-5f69-4f62-b405-913a0fa68111-kube-api-access-rrmd5\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503427 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-reloader\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503458 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzd6m\" (UniqueName: \"kubernetes.io/projected/0ee70002-33af-46fd-9fab-162fb47d4b22-kube-api-access-vzd6m\") pod \"frr-k8s-webhook-server-7fcb986d4-r699m\" (UID: \"0ee70002-33af-46fd-9fab-162fb47d4b22\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.503679 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-metrics\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.504056 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-sockets\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.504498 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-reloader\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.504753 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-conf\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: E1209 03:29:12.504832 4766 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 09 03:29:12 crc kubenswrapper[4766]: E1209 03:29:12.504876 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee70002-33af-46fd-9fab-162fb47d4b22-cert podName:0ee70002-33af-46fd-9fab-162fb47d4b22 nodeName:}" failed. No retries permitted until 2025-12-09 03:29:13.004859279 +0000 UTC m=+1034.714164825 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee70002-33af-46fd-9fab-162fb47d4b22-cert") pod "frr-k8s-webhook-server-7fcb986d4-r699m" (UID: "0ee70002-33af-46fd-9fab-162fb47d4b22") : secret "frr-k8s-webhook-server-cert" not found Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.504903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e61cbd09-5f69-4f62-b405-913a0fa68111-frr-startup\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.509610 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e61cbd09-5f69-4f62-b405-913a0fa68111-metrics-certs\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.520867 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmd5\" (UniqueName: \"kubernetes.io/projected/e61cbd09-5f69-4f62-b405-913a0fa68111-kube-api-access-rrmd5\") pod \"frr-k8s-t7t9l\" (UID: \"e61cbd09-5f69-4f62-b405-913a0fa68111\") " pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.521277 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzd6m\" (UniqueName: \"kubernetes.io/projected/0ee70002-33af-46fd-9fab-162fb47d4b22-kube-api-access-vzd6m\") pod \"frr-k8s-webhook-server-7fcb986d4-r699m\" (UID: \"0ee70002-33af-46fd-9fab-162fb47d4b22\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.604605 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-cert\") pod 
\"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.604873 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh68f\" (UniqueName: \"kubernetes.io/projected/cdf78232-a230-4c4e-a41e-fd13446f16c1-kube-api-access-sh68f\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.604904 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-metrics-certs\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.604920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cdf78232-a230-4c4e-a41e-fd13446f16c1-metallb-excludel2\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.604959 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-metrics-certs\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.604978 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz8cd\" (UniqueName: \"kubernetes.io/projected/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-kube-api-access-xz8cd\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " 
pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.605026 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: E1209 03:29:12.605035 4766 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 09 03:29:12 crc kubenswrapper[4766]: E1209 03:29:12.605088 4766 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 09 03:29:12 crc kubenswrapper[4766]: E1209 03:29:12.605108 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-metrics-certs podName:8b9c8d01-0ad9-4730-b328-9d67fb4322ba nodeName:}" failed. No retries permitted until 2025-12-09 03:29:13.105088774 +0000 UTC m=+1034.814394270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-metrics-certs") pod "controller-f8648f98b-n58vw" (UID: "8b9c8d01-0ad9-4730-b328-9d67fb4322ba") : secret "controller-certs-secret" not found Dec 09 03:29:12 crc kubenswrapper[4766]: E1209 03:29:12.605126 4766 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 03:29:12 crc kubenswrapper[4766]: E1209 03:29:12.605132 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-metrics-certs podName:cdf78232-a230-4c4e-a41e-fd13446f16c1 nodeName:}" failed. No retries permitted until 2025-12-09 03:29:13.105124335 +0000 UTC m=+1034.814429761 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-metrics-certs") pod "speaker-ftjpq" (UID: "cdf78232-a230-4c4e-a41e-fd13446f16c1") : secret "speaker-certs-secret" not found Dec 09 03:29:12 crc kubenswrapper[4766]: E1209 03:29:12.605180 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist podName:cdf78232-a230-4c4e-a41e-fd13446f16c1 nodeName:}" failed. No retries permitted until 2025-12-09 03:29:13.105166066 +0000 UTC m=+1034.814471492 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist") pod "speaker-ftjpq" (UID: "cdf78232-a230-4c4e-a41e-fd13446f16c1") : secret "metallb-memberlist" not found Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.605827 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cdf78232-a230-4c4e-a41e-fd13446f16c1-metallb-excludel2\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.606680 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.624469 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh68f\" (UniqueName: \"kubernetes.io/projected/cdf78232-a230-4c4e-a41e-fd13446f16c1-kube-api-access-sh68f\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.625601 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz8cd\" (UniqueName: 
\"kubernetes.io/projected/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-kube-api-access-xz8cd\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.637778 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:12 crc kubenswrapper[4766]: I1209 03:29:12.641068 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-cert\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.009992 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70002-33af-46fd-9fab-162fb47d4b22-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r699m\" (UID: \"0ee70002-33af-46fd-9fab-162fb47d4b22\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.016041 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee70002-33af-46fd-9fab-162fb47d4b22-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-r699m\" (UID: \"0ee70002-33af-46fd-9fab-162fb47d4b22\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.110845 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:13 crc kubenswrapper[4766]: E1209 03:29:13.110960 4766 secret.go:188] Couldn't get secret 
metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.110973 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-metrics-certs\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:13 crc kubenswrapper[4766]: E1209 03:29:13.111021 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist podName:cdf78232-a230-4c4e-a41e-fd13446f16c1 nodeName:}" failed. No retries permitted until 2025-12-09 03:29:14.111002698 +0000 UTC m=+1035.820308134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist") pod "speaker-ftjpq" (UID: "cdf78232-a230-4c4e-a41e-fd13446f16c1") : secret "metallb-memberlist" not found Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.111041 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-metrics-certs\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.114288 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b9c8d01-0ad9-4730-b328-9d67fb4322ba-metrics-certs\") pod \"controller-f8648f98b-n58vw\" (UID: \"8b9c8d01-0ad9-4730-b328-9d67fb4322ba\") " pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.115461 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-metrics-certs\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.244511 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.346967 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.471268 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m"] Dec 09 03:29:13 crc kubenswrapper[4766]: W1209 03:29:13.488317 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ee70002_33af_46fd_9fab_162fb47d4b22.slice/crio-773f259053f2a7b86d63d9393454987370700f0205a8a825eae6493ad3820ef2 WatchSource:0}: Error finding container 773f259053f2a7b86d63d9393454987370700f0205a8a825eae6493ad3820ef2: Status 404 returned error can't find the container with id 773f259053f2a7b86d63d9393454987370700f0205a8a825eae6493ad3820ef2 Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.751571 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-n58vw"] Dec 09 03:29:13 crc kubenswrapper[4766]: W1209 03:29:13.755158 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9c8d01_0ad9_4730_b328_9d67fb4322ba.slice/crio-c9c8299ea6206f15f1e06e14087f902a16aa46b4ede8e47dee91088474d6b856 WatchSource:0}: Error finding container c9c8299ea6206f15f1e06e14087f902a16aa46b4ede8e47dee91088474d6b856: Status 404 returned error can't find the container with id c9c8299ea6206f15f1e06e14087f902a16aa46b4ede8e47dee91088474d6b856 Dec 
09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.785744 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" event={"ID":"0ee70002-33af-46fd-9fab-162fb47d4b22","Type":"ContainerStarted","Data":"773f259053f2a7b86d63d9393454987370700f0205a8a825eae6493ad3820ef2"} Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.786824 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-n58vw" event={"ID":"8b9c8d01-0ad9-4730-b328-9d67fb4322ba","Type":"ContainerStarted","Data":"c9c8299ea6206f15f1e06e14087f902a16aa46b4ede8e47dee91088474d6b856"} Dec 09 03:29:13 crc kubenswrapper[4766]: I1209 03:29:13.787901 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerStarted","Data":"6c1f0868a72c7f7b6157387d106b8669f5668435ed7be5140b146cb34ad1d814"} Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.131854 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.140239 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cdf78232-a230-4c4e-a41e-fd13446f16c1-memberlist\") pod \"speaker-ftjpq\" (UID: \"cdf78232-a230-4c4e-a41e-fd13446f16c1\") " pod="metallb-system/speaker-ftjpq" Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.199937 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ftjpq" Dec 09 03:29:14 crc kubenswrapper[4766]: W1209 03:29:14.217047 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf78232_a230_4c4e_a41e_fd13446f16c1.slice/crio-9840540e8caeffdecfd9bfe92d25f6b8fe5ab34b0cf310916af76226b368d5b0 WatchSource:0}: Error finding container 9840540e8caeffdecfd9bfe92d25f6b8fe5ab34b0cf310916af76226b368d5b0: Status 404 returned error can't find the container with id 9840540e8caeffdecfd9bfe92d25f6b8fe5ab34b0cf310916af76226b368d5b0 Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.798728 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-n58vw" event={"ID":"8b9c8d01-0ad9-4730-b328-9d67fb4322ba","Type":"ContainerStarted","Data":"1a4e60592b11d29ac9e7820d9b462e004961a79ae27d8487bb4bd5f82cc83bae"} Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.798766 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-n58vw" event={"ID":"8b9c8d01-0ad9-4730-b328-9d67fb4322ba","Type":"ContainerStarted","Data":"afd5e4b5f7ccd5f487a49a528a810820af6eeb4cb44c43486a3f8e418df672cc"} Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.798904 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.800990 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ftjpq" event={"ID":"cdf78232-a230-4c4e-a41e-fd13446f16c1","Type":"ContainerStarted","Data":"8642159f3b7f964102e4092f776ebb6819a693a40d03c6a061cc8cc6228ce58a"} Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.801037 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ftjpq" 
event={"ID":"cdf78232-a230-4c4e-a41e-fd13446f16c1","Type":"ContainerStarted","Data":"37bf73ad4a7fa7c4a86bf9ca093fe850b4bc5c9f6212b137c5c49efe2453586b"} Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.801051 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ftjpq" event={"ID":"cdf78232-a230-4c4e-a41e-fd13446f16c1","Type":"ContainerStarted","Data":"9840540e8caeffdecfd9bfe92d25f6b8fe5ab34b0cf310916af76226b368d5b0"} Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.801415 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-ftjpq" Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.818541 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-n58vw" podStartSLOduration=2.818523957 podStartE2EDuration="2.818523957s" podCreationTimestamp="2025-12-09 03:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:29:14.815448564 +0000 UTC m=+1036.524753990" watchObservedRunningTime="2025-12-09 03:29:14.818523957 +0000 UTC m=+1036.527829383" Dec 09 03:29:14 crc kubenswrapper[4766]: I1209 03:29:14.835421 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ftjpq" podStartSLOduration=2.835402721 podStartE2EDuration="2.835402721s" podCreationTimestamp="2025-12-09 03:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:29:14.833429047 +0000 UTC m=+1036.542734473" watchObservedRunningTime="2025-12-09 03:29:14.835402721 +0000 UTC m=+1036.544708147" Dec 09 03:29:19 crc kubenswrapper[4766]: I1209 03:29:19.847950 4766 generic.go:334] "Generic (PLEG): container finished" podID="e61cbd09-5f69-4f62-b405-913a0fa68111" 
containerID="8f838bd57bfacb348d584abdd86d7f2d0a1afd4400edcf16af115298df504bb7" exitCode=0 Dec 09 03:29:19 crc kubenswrapper[4766]: I1209 03:29:19.848034 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerDied","Data":"8f838bd57bfacb348d584abdd86d7f2d0a1afd4400edcf16af115298df504bb7"} Dec 09 03:29:19 crc kubenswrapper[4766]: I1209 03:29:19.850380 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" event={"ID":"0ee70002-33af-46fd-9fab-162fb47d4b22","Type":"ContainerStarted","Data":"5f3cf04e7589dbbc84a2d35dcf14c3c1b0a06c5368ab7e66dafcfd5921acba49"} Dec 09 03:29:19 crc kubenswrapper[4766]: I1209 03:29:19.850533 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:19 crc kubenswrapper[4766]: I1209 03:29:19.928435 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" podStartSLOduration=2.099662885 podStartE2EDuration="7.928414409s" podCreationTimestamp="2025-12-09 03:29:12 +0000 UTC" firstStartedPulling="2025-12-09 03:29:13.494495591 +0000 UTC m=+1035.203801017" lastFinishedPulling="2025-12-09 03:29:19.323247105 +0000 UTC m=+1041.032552541" observedRunningTime="2025-12-09 03:29:19.923815565 +0000 UTC m=+1041.633121011" watchObservedRunningTime="2025-12-09 03:29:19.928414409 +0000 UTC m=+1041.637719835" Dec 09 03:29:20 crc kubenswrapper[4766]: I1209 03:29:20.859916 4766 generic.go:334] "Generic (PLEG): container finished" podID="e61cbd09-5f69-4f62-b405-913a0fa68111" containerID="29bd936d15b309c48af01e2585210339e473464c45b275a9a9ea1eb616db4b31" exitCode=0 Dec 09 03:29:20 crc kubenswrapper[4766]: I1209 03:29:20.860050 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" 
event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerDied","Data":"29bd936d15b309c48af01e2585210339e473464c45b275a9a9ea1eb616db4b31"} Dec 09 03:29:21 crc kubenswrapper[4766]: I1209 03:29:21.871141 4766 generic.go:334] "Generic (PLEG): container finished" podID="e61cbd09-5f69-4f62-b405-913a0fa68111" containerID="a6e6f216115086b09d159d925de3d98207033a6f4fd84cf4529ffb46e73cb139" exitCode=0 Dec 09 03:29:21 crc kubenswrapper[4766]: I1209 03:29:21.871192 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerDied","Data":"a6e6f216115086b09d159d925de3d98207033a6f4fd84cf4529ffb46e73cb139"} Dec 09 03:29:22 crc kubenswrapper[4766]: I1209 03:29:22.886386 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerStarted","Data":"16f55727e7d1763c73b20c09160ebbcbb528a5c46c0d969f90e2f136e61ecc0d"} Dec 09 03:29:22 crc kubenswrapper[4766]: I1209 03:29:22.886682 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerStarted","Data":"c5afe5d064e130b5594022ba6890712638c971f1c490d8190a02b01964da998a"} Dec 09 03:29:22 crc kubenswrapper[4766]: I1209 03:29:22.886695 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerStarted","Data":"a01eff967d378dc15a50e7d8118c863681c05de18bfeea2e15aa9861dee37d20"} Dec 09 03:29:22 crc kubenswrapper[4766]: I1209 03:29:22.886705 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerStarted","Data":"1a9b0ec4c0993f70b34e5284ada2bfb9e083bc32fc82c4d1b0cd38f1934bf0fd"} Dec 09 03:29:22 crc kubenswrapper[4766]: I1209 03:29:22.886713 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerStarted","Data":"9068bb0a54c4deb99f3966a712f7e27ad68091977a15921da72b36fcec74cfee"} Dec 09 03:29:23 crc kubenswrapper[4766]: I1209 03:29:23.351116 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-n58vw" Dec 09 03:29:23 crc kubenswrapper[4766]: I1209 03:29:23.900921 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-t7t9l" event={"ID":"e61cbd09-5f69-4f62-b405-913a0fa68111","Type":"ContainerStarted","Data":"643f9e77c3524b2f0e47aea863fabb608c9a81d9efb5e3242a72b9c8a249f899"} Dec 09 03:29:23 crc kubenswrapper[4766]: I1209 03:29:23.901324 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:23 crc kubenswrapper[4766]: I1209 03:29:23.941880 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-t7t9l" podStartSLOduration=5.457309356 podStartE2EDuration="11.941858375s" podCreationTimestamp="2025-12-09 03:29:12 +0000 UTC" firstStartedPulling="2025-12-09 03:29:12.818717789 +0000 UTC m=+1034.528023225" lastFinishedPulling="2025-12-09 03:29:19.303266828 +0000 UTC m=+1041.012572244" observedRunningTime="2025-12-09 03:29:23.935263538 +0000 UTC m=+1045.644569024" watchObservedRunningTime="2025-12-09 03:29:23.941858375 +0000 UTC m=+1045.651163811" Dec 09 03:29:24 crc kubenswrapper[4766]: I1209 03:29:24.206061 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ftjpq" Dec 09 03:29:25 crc kubenswrapper[4766]: I1209 03:29:25.861860 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2"] Dec 09 03:29:25 crc kubenswrapper[4766]: I1209 03:29:25.863307 4766 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:25 crc kubenswrapper[4766]: I1209 03:29:25.865265 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 03:29:25 crc kubenswrapper[4766]: I1209 03:29:25.872143 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2"] Dec 09 03:29:25 crc kubenswrapper[4766]: I1209 03:29:25.907699 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:25 crc kubenswrapper[4766]: I1209 03:29:25.908067 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:25 crc kubenswrapper[4766]: I1209 03:29:25.908180 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk22r\" (UniqueName: \"kubernetes.io/projected/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-kube-api-access-tk22r\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:26 crc 
kubenswrapper[4766]: I1209 03:29:26.010171 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk22r\" (UniqueName: \"kubernetes.io/projected/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-kube-api-access-tk22r\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:26 crc kubenswrapper[4766]: I1209 03:29:26.010284 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:26 crc kubenswrapper[4766]: I1209 03:29:26.010315 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:26 crc kubenswrapper[4766]: I1209 03:29:26.010771 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:26 crc kubenswrapper[4766]: I1209 03:29:26.010861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:26 crc kubenswrapper[4766]: I1209 03:29:26.036996 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk22r\" (UniqueName: \"kubernetes.io/projected/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-kube-api-access-tk22r\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:26 crc kubenswrapper[4766]: I1209 03:29:26.177582 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:26 crc kubenswrapper[4766]: I1209 03:29:26.371534 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2"] Dec 09 03:29:26 crc kubenswrapper[4766]: I1209 03:29:26.923479 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" event={"ID":"94aaaa56-1eca-4031-a1a9-54f8c7ce4049","Type":"ContainerStarted","Data":"d5d84103f3ffcbd654b3446e239284ed0187be25cf28c5fbc7b3fcf1fd855576"} Dec 09 03:29:27 crc kubenswrapper[4766]: I1209 03:29:27.639389 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:27 crc kubenswrapper[4766]: I1209 03:29:27.686689 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:27 crc kubenswrapper[4766]: I1209 03:29:27.931785 4766 generic.go:334] 
"Generic (PLEG): container finished" podID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerID="22e0b0fa8532893d96f8a422865603ae030028ebf08fed7cae139705f80ee8ab" exitCode=0 Dec 09 03:29:27 crc kubenswrapper[4766]: I1209 03:29:27.931842 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" event={"ID":"94aaaa56-1eca-4031-a1a9-54f8c7ce4049","Type":"ContainerDied","Data":"22e0b0fa8532893d96f8a422865603ae030028ebf08fed7cae139705f80ee8ab"} Dec 09 03:29:30 crc kubenswrapper[4766]: I1209 03:29:30.948755 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" event={"ID":"94aaaa56-1eca-4031-a1a9-54f8c7ce4049","Type":"ContainerStarted","Data":"3daa046485059252f9c6a484843b2e2ec32e79a4d0497a38370782ae2db6878e"} Dec 09 03:29:31 crc kubenswrapper[4766]: I1209 03:29:31.959451 4766 generic.go:334] "Generic (PLEG): container finished" podID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerID="3daa046485059252f9c6a484843b2e2ec32e79a4d0497a38370782ae2db6878e" exitCode=0 Dec 09 03:29:31 crc kubenswrapper[4766]: I1209 03:29:31.959492 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" event={"ID":"94aaaa56-1eca-4031-a1a9-54f8c7ce4049","Type":"ContainerDied","Data":"3daa046485059252f9c6a484843b2e2ec32e79a4d0497a38370782ae2db6878e"} Dec 09 03:29:32 crc kubenswrapper[4766]: I1209 03:29:32.642680 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-t7t9l" Dec 09 03:29:32 crc kubenswrapper[4766]: I1209 03:29:32.965715 4766 generic.go:334] "Generic (PLEG): container finished" podID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerID="d26f87104c228c1fcd9b0413ca01383f341c4db0e5e0c3e04cec50cd5a2d3e8e" exitCode=0 Dec 09 03:29:32 crc kubenswrapper[4766]: I1209 03:29:32.965757 
4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" event={"ID":"94aaaa56-1eca-4031-a1a9-54f8c7ce4049","Type":"ContainerDied","Data":"d26f87104c228c1fcd9b0413ca01383f341c4db0e5e0c3e04cec50cd5a2d3e8e"} Dec 09 03:29:33 crc kubenswrapper[4766]: I1209 03:29:33.265852 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-r699m" Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.282504 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.331397 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk22r\" (UniqueName: \"kubernetes.io/projected/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-kube-api-access-tk22r\") pod \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.331497 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-bundle\") pod \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.333562 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-util\") pod \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\" (UID: \"94aaaa56-1eca-4031-a1a9-54f8c7ce4049\") " Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.333683 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-bundle" 
(OuterVolumeSpecName: "bundle") pod "94aaaa56-1eca-4031-a1a9-54f8c7ce4049" (UID: "94aaaa56-1eca-4031-a1a9-54f8c7ce4049"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.334116 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.341539 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-kube-api-access-tk22r" (OuterVolumeSpecName: "kube-api-access-tk22r") pod "94aaaa56-1eca-4031-a1a9-54f8c7ce4049" (UID: "94aaaa56-1eca-4031-a1a9-54f8c7ce4049"). InnerVolumeSpecName "kube-api-access-tk22r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.348394 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-util" (OuterVolumeSpecName: "util") pod "94aaaa56-1eca-4031-a1a9-54f8c7ce4049" (UID: "94aaaa56-1eca-4031-a1a9-54f8c7ce4049"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.435419 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk22r\" (UniqueName: \"kubernetes.io/projected/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-kube-api-access-tk22r\") on node \"crc\" DevicePath \"\"" Dec 09 03:29:34 crc kubenswrapper[4766]: I1209 03:29:34.435483 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94aaaa56-1eca-4031-a1a9-54f8c7ce4049-util\") on node \"crc\" DevicePath \"\"" Dec 09 03:29:35 crc kubenswrapper[4766]: I1209 03:29:35.664864 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" Dec 09 03:29:35 crc kubenswrapper[4766]: I1209 03:29:35.666377 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2" event={"ID":"94aaaa56-1eca-4031-a1a9-54f8c7ce4049","Type":"ContainerDied","Data":"d5d84103f3ffcbd654b3446e239284ed0187be25cf28c5fbc7b3fcf1fd855576"} Dec 09 03:29:35 crc kubenswrapper[4766]: I1209 03:29:35.666497 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5d84103f3ffcbd654b3446e239284ed0187be25cf28c5fbc7b3fcf1fd855576" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.088844 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25"] Dec 09 03:29:38 crc kubenswrapper[4766]: E1209 03:29:38.089534 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerName="util" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.089554 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerName="util" Dec 09 03:29:38 crc kubenswrapper[4766]: E1209 03:29:38.089572 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerName="pull" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.089580 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerName="pull" Dec 09 03:29:38 crc kubenswrapper[4766]: E1209 03:29:38.089595 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerName="extract" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.089604 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerName="extract" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.089761 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="94aaaa56-1eca-4031-a1a9-54f8c7ce4049" containerName="extract" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.090313 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.093624 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.093815 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.094187 4766 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-jhtjb" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.102538 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25"] Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.269726 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-xkk25\" (UID: \"49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.269864 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtkk\" (UniqueName: 
\"kubernetes.io/projected/49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093-kube-api-access-zgtkk\") pod \"cert-manager-operator-controller-manager-64cf6dff88-xkk25\" (UID: \"49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.372166 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-xkk25\" (UID: \"49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.372330 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtkk\" (UniqueName: \"kubernetes.io/projected/49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093-kube-api-access-zgtkk\") pod \"cert-manager-operator-controller-manager-64cf6dff88-xkk25\" (UID: \"49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.372759 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-xkk25\" (UID: \"49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.397570 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgtkk\" (UniqueName: \"kubernetes.io/projected/49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093-kube-api-access-zgtkk\") pod \"cert-manager-operator-controller-manager-64cf6dff88-xkk25\" (UID: 
\"49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.458300 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" Dec 09 03:29:38 crc kubenswrapper[4766]: I1209 03:29:38.947831 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25"] Dec 09 03:29:38 crc kubenswrapper[4766]: W1209 03:29:38.960088 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a5fc5f_7ad5_4fc2_8e5b_f9d8ea0f7093.slice/crio-8c60f1630b14485405164a603d704a4a319e7736de23792ff598318ee2105ac0 WatchSource:0}: Error finding container 8c60f1630b14485405164a603d704a4a319e7736de23792ff598318ee2105ac0: Status 404 returned error can't find the container with id 8c60f1630b14485405164a603d704a4a319e7736de23792ff598318ee2105ac0 Dec 09 03:29:39 crc kubenswrapper[4766]: I1209 03:29:39.709841 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" event={"ID":"49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093","Type":"ContainerStarted","Data":"8c60f1630b14485405164a603d704a4a319e7736de23792ff598318ee2105ac0"} Dec 09 03:29:46 crc kubenswrapper[4766]: I1209 03:29:46.756603 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" event={"ID":"49a5fc5f-7ad5-4fc2-8e5b-f9d8ea0f7093","Type":"ContainerStarted","Data":"7aad14a1cd95d68ac32d0767b18106089310e8be28fc58daa9183c629db92a57"} Dec 09 03:29:46 crc kubenswrapper[4766]: I1209 03:29:46.785754 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-xkk25" podStartSLOduration=2.013027284 podStartE2EDuration="8.785736562s" podCreationTimestamp="2025-12-09 03:29:38 +0000 UTC" firstStartedPulling="2025-12-09 03:29:38.962027421 +0000 UTC m=+1060.671332847" lastFinishedPulling="2025-12-09 03:29:45.734736689 +0000 UTC m=+1067.444042125" observedRunningTime="2025-12-09 03:29:46.781690953 +0000 UTC m=+1068.490996409" watchObservedRunningTime="2025-12-09 03:29:46.785736562 +0000 UTC m=+1068.495041988" Dec 09 03:29:49 crc kubenswrapper[4766]: I1209 03:29:49.216937 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-g9vlz"] Dec 09 03:29:49 crc kubenswrapper[4766]: I1209 03:29:49.217960 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:49 crc kubenswrapper[4766]: W1209 03:29:49.219455 4766 reflector.go:561] object-"cert-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "cert-manager": no relationship found between node 'crc' and this object Dec 09 03:29:49 crc kubenswrapper[4766]: E1209 03:29:49.219498 4766 reflector.go:158] "Unhandled Error" err="object-\"cert-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"cert-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 09 03:29:49 crc kubenswrapper[4766]: W1209 03:29:49.220100 4766 reflector.go:561] object-"cert-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in 
API group "" in the namespace "cert-manager": no relationship found between node 'crc' and this object Dec 09 03:29:49 crc kubenswrapper[4766]: E1209 03:29:49.220168 4766 reflector.go:158] "Unhandled Error" err="object-\"cert-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"cert-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 09 03:29:49 crc kubenswrapper[4766]: W1209 03:29:49.221174 4766 reflector.go:561] object-"cert-manager"/"cert-manager-webhook-dockercfg-84h48": failed to list *v1.Secret: secrets "cert-manager-webhook-dockercfg-84h48" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "cert-manager": no relationship found between node 'crc' and this object Dec 09 03:29:49 crc kubenswrapper[4766]: E1209 03:29:49.221251 4766 reflector.go:158] "Unhandled Error" err="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-84h48\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-manager-webhook-dockercfg-84h48\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"cert-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 09 03:29:49 crc kubenswrapper[4766]: I1209 03:29:49.232613 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-g9vlz"] Dec 09 03:29:49 crc kubenswrapper[4766]: I1209 03:29:49.329103 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-g9vlz\" (UID: 
\"80e97801-4f98-4f32-b540-c9edb2ad39a9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:49 crc kubenswrapper[4766]: I1209 03:29:49.329159 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6bv7\" (UniqueName: \"kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-kube-api-access-f6bv7\") pod \"cert-manager-webhook-f4fb5df64-g9vlz\" (UID: \"80e97801-4f98-4f32-b540-c9edb2ad39a9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:49 crc kubenswrapper[4766]: I1209 03:29:49.430131 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-g9vlz\" (UID: \"80e97801-4f98-4f32-b540-c9edb2ad39a9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:49 crc kubenswrapper[4766]: I1209 03:29:49.430196 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6bv7\" (UniqueName: \"kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-kube-api-access-f6bv7\") pod \"cert-manager-webhook-f4fb5df64-g9vlz\" (UID: \"80e97801-4f98-4f32-b540-c9edb2ad39a9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:49 crc kubenswrapper[4766]: I1209 03:29:49.448404 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-g9vlz\" (UID: \"80e97801-4f98-4f32-b540-c9edb2ad39a9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:50 crc kubenswrapper[4766]: I1209 03:29:50.437148 4766 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-84h48" Dec 09 03:29:50 crc kubenswrapper[4766]: 
E1209 03:29:50.443895 4766 projected.go:288] Couldn't get configMap cert-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 09 03:29:50 crc kubenswrapper[4766]: I1209 03:29:50.619973 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 09 03:29:50 crc kubenswrapper[4766]: E1209 03:29:50.624047 4766 projected.go:194] Error preparing data for projected volume kube-api-access-f6bv7 for pod cert-manager/cert-manager-webhook-f4fb5df64-g9vlz: failed to sync configmap cache: timed out waiting for the condition Dec 09 03:29:50 crc kubenswrapper[4766]: E1209 03:29:50.624120 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-kube-api-access-f6bv7 podName:80e97801-4f98-4f32-b540-c9edb2ad39a9 nodeName:}" failed. No retries permitted until 2025-12-09 03:29:51.124098721 +0000 UTC m=+1072.833404147 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f6bv7" (UniqueName: "kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-kube-api-access-f6bv7") pod "cert-manager-webhook-f4fb5df64-g9vlz" (UID: "80e97801-4f98-4f32-b540-c9edb2ad39a9") : failed to sync configmap cache: timed out waiting for the condition Dec 09 03:29:50 crc kubenswrapper[4766]: I1209 03:29:50.759950 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 09 03:29:51 crc kubenswrapper[4766]: I1209 03:29:51.154492 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6bv7\" (UniqueName: \"kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-kube-api-access-f6bv7\") pod \"cert-manager-webhook-f4fb5df64-g9vlz\" (UID: \"80e97801-4f98-4f32-b540-c9edb2ad39a9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:51 crc kubenswrapper[4766]: I1209 03:29:51.157852 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6bv7\" (UniqueName: \"kubernetes.io/projected/80e97801-4f98-4f32-b540-c9edb2ad39a9-kube-api-access-f6bv7\") pod \"cert-manager-webhook-f4fb5df64-g9vlz\" (UID: \"80e97801-4f98-4f32-b540-c9edb2ad39a9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:51 crc kubenswrapper[4766]: I1209 03:29:51.339960 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:29:51 crc kubenswrapper[4766]: I1209 03:29:51.673928 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-g9vlz"] Dec 09 03:29:51 crc kubenswrapper[4766]: I1209 03:29:51.783402 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" event={"ID":"80e97801-4f98-4f32-b540-c9edb2ad39a9","Type":"ContainerStarted","Data":"cdaca7e589eeb93b2154118701d55a73c52e7bd4b74b94b7314945cb58c49176"} Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.701483 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb"] Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.702372 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.707657 4766 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zmkmm" Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.714829 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb"] Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.774781 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwkw5\" (UniqueName: \"kubernetes.io/projected/0d683a6a-fcb6-4476-9a2d-e1905c04d0cb-kube-api-access-fwkw5\") pod \"cert-manager-cainjector-855d9ccff4-ljgrb\" (UID: \"0d683a6a-fcb6-4476-9a2d-e1905c04d0cb\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.774830 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/0d683a6a-fcb6-4476-9a2d-e1905c04d0cb-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-ljgrb\" (UID: \"0d683a6a-fcb6-4476-9a2d-e1905c04d0cb\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.875580 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d683a6a-fcb6-4476-9a2d-e1905c04d0cb-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-ljgrb\" (UID: \"0d683a6a-fcb6-4476-9a2d-e1905c04d0cb\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.875693 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwkw5\" (UniqueName: \"kubernetes.io/projected/0d683a6a-fcb6-4476-9a2d-e1905c04d0cb-kube-api-access-fwkw5\") pod \"cert-manager-cainjector-855d9ccff4-ljgrb\" (UID: \"0d683a6a-fcb6-4476-9a2d-e1905c04d0cb\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.898050 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d683a6a-fcb6-4476-9a2d-e1905c04d0cb-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-ljgrb\" (UID: \"0d683a6a-fcb6-4476-9a2d-e1905c04d0cb\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" Dec 09 03:29:52 crc kubenswrapper[4766]: I1209 03:29:52.898755 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwkw5\" (UniqueName: \"kubernetes.io/projected/0d683a6a-fcb6-4476-9a2d-e1905c04d0cb-kube-api-access-fwkw5\") pod \"cert-manager-cainjector-855d9ccff4-ljgrb\" (UID: \"0d683a6a-fcb6-4476-9a2d-e1905c04d0cb\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" Dec 09 03:29:53 crc kubenswrapper[4766]: I1209 03:29:53.024301 4766 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" Dec 09 03:29:53 crc kubenswrapper[4766]: I1209 03:29:53.476983 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb"] Dec 09 03:29:53 crc kubenswrapper[4766]: I1209 03:29:53.801204 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" event={"ID":"0d683a6a-fcb6-4476-9a2d-e1905c04d0cb","Type":"ContainerStarted","Data":"55c0815194664d4ef3e4a7d8acc3d49a60f1542804d3e65602849b44e1208781"} Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.142757 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd"] Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.144465 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.146403 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.146857 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.151373 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd"] Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.271178 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-secret-volume\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.271290 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-config-volume\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.271393 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpj8j\" (UniqueName: \"kubernetes.io/projected/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-kube-api-access-lpj8j\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.372598 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-secret-volume\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.372638 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-config-volume\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.372698 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpj8j\" (UniqueName: 
\"kubernetes.io/projected/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-kube-api-access-lpj8j\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.373532 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-config-volume\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.387141 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-secret-volume\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.390679 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpj8j\" (UniqueName: \"kubernetes.io/projected/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-kube-api-access-lpj8j\") pod \"collect-profiles-29420850-w6fpd\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.478103 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.679753 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-zb5bt"] Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.680993 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-zb5bt" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.682874 4766 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ds8hp" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.698913 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-zb5bt"] Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.778088 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d82wk\" (UniqueName: \"kubernetes.io/projected/ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b-kube-api-access-d82wk\") pod \"cert-manager-86cb77c54b-zb5bt\" (UID: \"ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b\") " pod="cert-manager/cert-manager-86cb77c54b-zb5bt" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.778160 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b-bound-sa-token\") pod \"cert-manager-86cb77c54b-zb5bt\" (UID: \"ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b\") " pod="cert-manager/cert-manager-86cb77c54b-zb5bt" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.850302 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" event={"ID":"80e97801-4f98-4f32-b540-c9edb2ad39a9","Type":"ContainerStarted","Data":"64f3081570d546c08815ddc2e9eeced90fbc771e3f9a47b068e243ca2a9e4afc"} Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.850395 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.851556 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" 
event={"ID":"0d683a6a-fcb6-4476-9a2d-e1905c04d0cb","Type":"ContainerStarted","Data":"6f198d95ea86ae96b72305b96d2edf865e9299d05ce64ea4a5744a7cf26d7790"} Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.871658 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" podStartSLOduration=3.156263314 podStartE2EDuration="11.871639613s" podCreationTimestamp="2025-12-09 03:29:49 +0000 UTC" firstStartedPulling="2025-12-09 03:29:51.682144744 +0000 UTC m=+1073.391450170" lastFinishedPulling="2025-12-09 03:30:00.397521053 +0000 UTC m=+1082.106826469" observedRunningTime="2025-12-09 03:30:00.863193066 +0000 UTC m=+1082.572498502" watchObservedRunningTime="2025-12-09 03:30:00.871639613 +0000 UTC m=+1082.580945039" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.885563 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d82wk\" (UniqueName: \"kubernetes.io/projected/ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b-kube-api-access-d82wk\") pod \"cert-manager-86cb77c54b-zb5bt\" (UID: \"ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b\") " pod="cert-manager/cert-manager-86cb77c54b-zb5bt" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.885648 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b-bound-sa-token\") pod \"cert-manager-86cb77c54b-zb5bt\" (UID: \"ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b\") " pod="cert-manager/cert-manager-86cb77c54b-zb5bt" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.891695 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-ljgrb" podStartSLOduration=2.010696682 podStartE2EDuration="8.891672111s" podCreationTimestamp="2025-12-09 03:29:52 +0000 UTC" firstStartedPulling="2025-12-09 03:29:53.483940267 +0000 UTC m=+1075.193245693" 
lastFinishedPulling="2025-12-09 03:30:00.364915696 +0000 UTC m=+1082.074221122" observedRunningTime="2025-12-09 03:30:00.883683006 +0000 UTC m=+1082.592988782" watchObservedRunningTime="2025-12-09 03:30:00.891672111 +0000 UTC m=+1082.600977537" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.911707 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d82wk\" (UniqueName: \"kubernetes.io/projected/ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b-kube-api-access-d82wk\") pod \"cert-manager-86cb77c54b-zb5bt\" (UID: \"ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b\") " pod="cert-manager/cert-manager-86cb77c54b-zb5bt" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.920012 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b-bound-sa-token\") pod \"cert-manager-86cb77c54b-zb5bt\" (UID: \"ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b\") " pod="cert-manager/cert-manager-86cb77c54b-zb5bt" Dec 09 03:30:00 crc kubenswrapper[4766]: I1209 03:30:00.967231 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd"] Dec 09 03:30:01 crc kubenswrapper[4766]: I1209 03:30:01.004298 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-zb5bt" Dec 09 03:30:01 crc kubenswrapper[4766]: I1209 03:30:01.233725 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-zb5bt"] Dec 09 03:30:01 crc kubenswrapper[4766]: W1209 03:30:01.243060 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff28ddcc_8cd5_4f2e_bb76_0d23db11d40b.slice/crio-a8719add9b1fa8b7fe5cacde8b3916b7e6b3351078080cfb3fd3eaf949eaa95c WatchSource:0}: Error finding container a8719add9b1fa8b7fe5cacde8b3916b7e6b3351078080cfb3fd3eaf949eaa95c: Status 404 returned error can't find the container with id a8719add9b1fa8b7fe5cacde8b3916b7e6b3351078080cfb3fd3eaf949eaa95c Dec 09 03:30:01 crc kubenswrapper[4766]: I1209 03:30:01.858592 4766 generic.go:334] "Generic (PLEG): container finished" podID="0bc3c00f-71ec-4d0e-8e7e-9d3476e35995" containerID="6fd5729ab6e5bde7239983435485a5329d84ed065309b7d8a8c00d11612017b0" exitCode=0 Dec 09 03:30:01 crc kubenswrapper[4766]: I1209 03:30:01.858691 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" event={"ID":"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995","Type":"ContainerDied","Data":"6fd5729ab6e5bde7239983435485a5329d84ed065309b7d8a8c00d11612017b0"} Dec 09 03:30:01 crc kubenswrapper[4766]: I1209 03:30:01.858929 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" event={"ID":"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995","Type":"ContainerStarted","Data":"6021f451f0ef463cb41a49c9777510a003984bb263a3a145b20c43746bd7376c"} Dec 09 03:30:01 crc kubenswrapper[4766]: I1209 03:30:01.860739 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-zb5bt" 
event={"ID":"ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b","Type":"ContainerStarted","Data":"00bef5f86284fbafada563ace297f96e5d4cf7a82f1d7f05003a38825bc51386"} Dec 09 03:30:01 crc kubenswrapper[4766]: I1209 03:30:01.860795 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-zb5bt" event={"ID":"ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b","Type":"ContainerStarted","Data":"a8719add9b1fa8b7fe5cacde8b3916b7e6b3351078080cfb3fd3eaf949eaa95c"} Dec 09 03:30:01 crc kubenswrapper[4766]: I1209 03:30:01.906112 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-zb5bt" podStartSLOduration=1.90606874 podStartE2EDuration="1.90606874s" podCreationTimestamp="2025-12-09 03:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:30:01.895316182 +0000 UTC m=+1083.604621608" watchObservedRunningTime="2025-12-09 03:30:01.90606874 +0000 UTC m=+1083.615374176" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.244893 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.417432 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpj8j\" (UniqueName: \"kubernetes.io/projected/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-kube-api-access-lpj8j\") pod \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.417481 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-secret-volume\") pod \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.417555 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-config-volume\") pod \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\" (UID: \"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995\") " Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.418454 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-config-volume" (OuterVolumeSpecName: "config-volume") pod "0bc3c00f-71ec-4d0e-8e7e-9d3476e35995" (UID: "0bc3c00f-71ec-4d0e-8e7e-9d3476e35995"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.422494 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-kube-api-access-lpj8j" (OuterVolumeSpecName: "kube-api-access-lpj8j") pod "0bc3c00f-71ec-4d0e-8e7e-9d3476e35995" (UID: "0bc3c00f-71ec-4d0e-8e7e-9d3476e35995"). 
InnerVolumeSpecName "kube-api-access-lpj8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.423898 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0bc3c00f-71ec-4d0e-8e7e-9d3476e35995" (UID: "0bc3c00f-71ec-4d0e-8e7e-9d3476e35995"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.519281 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpj8j\" (UniqueName: \"kubernetes.io/projected/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-kube-api-access-lpj8j\") on node \"crc\" DevicePath \"\"" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.519339 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.519359 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.877514 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" event={"ID":"0bc3c00f-71ec-4d0e-8e7e-9d3476e35995","Type":"ContainerDied","Data":"6021f451f0ef463cb41a49c9777510a003984bb263a3a145b20c43746bd7376c"} Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.877864 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6021f451f0ef463cb41a49c9777510a003984bb263a3a145b20c43746bd7376c" Dec 09 03:30:03 crc kubenswrapper[4766]: I1209 03:30:03.877597 4766 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd" Dec 09 03:30:03 crc kubenswrapper[4766]: E1209 03:30:03.910251 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc3c00f_71ec_4d0e_8e7e_9d3476e35995.slice\": RecentStats: unable to find data in memory cache]" Dec 09 03:30:06 crc kubenswrapper[4766]: I1209 03:30:06.343979 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-g9vlz" Dec 09 03:30:07 crc kubenswrapper[4766]: I1209 03:30:07.316932 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:30:07 crc kubenswrapper[4766]: I1209 03:30:07.317000 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.280945 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rr5t6"] Dec 09 03:30:10 crc kubenswrapper[4766]: E1209 03:30:10.281785 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc3c00f-71ec-4d0e-8e7e-9d3476e35995" containerName="collect-profiles" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.281817 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc3c00f-71ec-4d0e-8e7e-9d3476e35995" containerName="collect-profiles" Dec 09 03:30:10 crc 
kubenswrapper[4766]: I1209 03:30:10.282083 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc3c00f-71ec-4d0e-8e7e-9d3476e35995" containerName="collect-profiles" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.282795 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rr5t6" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.287674 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.291026 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-m7d99" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.291192 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.295278 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rr5t6"] Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.412090 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brrj6\" (UniqueName: \"kubernetes.io/projected/21ba4f8d-9d59-4adc-a797-6a0cadf7dce8-kube-api-access-brrj6\") pod \"openstack-operator-index-rr5t6\" (UID: \"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8\") " pod="openstack-operators/openstack-operator-index-rr5t6" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.513118 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brrj6\" (UniqueName: \"kubernetes.io/projected/21ba4f8d-9d59-4adc-a797-6a0cadf7dce8-kube-api-access-brrj6\") pod \"openstack-operator-index-rr5t6\" (UID: \"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8\") " pod="openstack-operators/openstack-operator-index-rr5t6" Dec 09 03:30:10 crc 
kubenswrapper[4766]: I1209 03:30:10.531123 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brrj6\" (UniqueName: \"kubernetes.io/projected/21ba4f8d-9d59-4adc-a797-6a0cadf7dce8-kube-api-access-brrj6\") pod \"openstack-operator-index-rr5t6\" (UID: \"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8\") " pod="openstack-operators/openstack-operator-index-rr5t6" Dec 09 03:30:10 crc kubenswrapper[4766]: I1209 03:30:10.603540 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rr5t6" Dec 09 03:30:11 crc kubenswrapper[4766]: I1209 03:30:11.004718 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rr5t6"] Dec 09 03:30:11 crc kubenswrapper[4766]: W1209 03:30:11.020494 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21ba4f8d_9d59_4adc_a797_6a0cadf7dce8.slice/crio-dc45707a2f9356d0ba80f7c8056b0b94c6db8a5a57b9f4bd938b0eb02c346dd9 WatchSource:0}: Error finding container dc45707a2f9356d0ba80f7c8056b0b94c6db8a5a57b9f4bd938b0eb02c346dd9: Status 404 returned error can't find the container with id dc45707a2f9356d0ba80f7c8056b0b94c6db8a5a57b9f4bd938b0eb02c346dd9 Dec 09 03:30:11 crc kubenswrapper[4766]: I1209 03:30:11.939088 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rr5t6" event={"ID":"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8","Type":"ContainerStarted","Data":"dc45707a2f9356d0ba80f7c8056b0b94c6db8a5a57b9f4bd938b0eb02c346dd9"} Dec 09 03:30:13 crc kubenswrapper[4766]: I1209 03:30:13.852312 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rr5t6"] Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.466044 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-h2hcg"] Dec 09 03:30:14 crc 
kubenswrapper[4766]: I1209 03:30:14.467006 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.472750 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h2hcg"] Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.568095 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49tjs\" (UniqueName: \"kubernetes.io/projected/fd134238-f047-4601-a2e0-58781bdb521b-kube-api-access-49tjs\") pod \"openstack-operator-index-h2hcg\" (UID: \"fd134238-f047-4601-a2e0-58781bdb521b\") " pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.669769 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49tjs\" (UniqueName: \"kubernetes.io/projected/fd134238-f047-4601-a2e0-58781bdb521b-kube-api-access-49tjs\") pod \"openstack-operator-index-h2hcg\" (UID: \"fd134238-f047-4601-a2e0-58781bdb521b\") " pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.709717 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49tjs\" (UniqueName: \"kubernetes.io/projected/fd134238-f047-4601-a2e0-58781bdb521b-kube-api-access-49tjs\") pod \"openstack-operator-index-h2hcg\" (UID: \"fd134238-f047-4601-a2e0-58781bdb521b\") " pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.790325 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.958944 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rr5t6" event={"ID":"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8","Type":"ContainerStarted","Data":"f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795"} Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.959036 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rr5t6" podUID="21ba4f8d-9d59-4adc-a797-6a0cadf7dce8" containerName="registry-server" containerID="cri-o://f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795" gracePeriod=2 Dec 09 03:30:14 crc kubenswrapper[4766]: I1209 03:30:14.976755 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rr5t6" podStartSLOduration=1.687574309 podStartE2EDuration="4.97673787s" podCreationTimestamp="2025-12-09 03:30:10 +0000 UTC" firstStartedPulling="2025-12-09 03:30:11.022906375 +0000 UTC m=+1092.732211801" lastFinishedPulling="2025-12-09 03:30:14.312069916 +0000 UTC m=+1096.021375362" observedRunningTime="2025-12-09 03:30:14.974870399 +0000 UTC m=+1096.684175855" watchObservedRunningTime="2025-12-09 03:30:14.97673787 +0000 UTC m=+1096.686043306" Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.185107 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h2hcg"] Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.293772 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rr5t6" Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.378795 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brrj6\" (UniqueName: \"kubernetes.io/projected/21ba4f8d-9d59-4adc-a797-6a0cadf7dce8-kube-api-access-brrj6\") pod \"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8\" (UID: \"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8\") " Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.389769 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ba4f8d-9d59-4adc-a797-6a0cadf7dce8-kube-api-access-brrj6" (OuterVolumeSpecName: "kube-api-access-brrj6") pod "21ba4f8d-9d59-4adc-a797-6a0cadf7dce8" (UID: "21ba4f8d-9d59-4adc-a797-6a0cadf7dce8"). InnerVolumeSpecName "kube-api-access-brrj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.480244 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brrj6\" (UniqueName: \"kubernetes.io/projected/21ba4f8d-9d59-4adc-a797-6a0cadf7dce8-kube-api-access-brrj6\") on node \"crc\" DevicePath \"\"" Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.970452 4766 generic.go:334] "Generic (PLEG): container finished" podID="21ba4f8d-9d59-4adc-a797-6a0cadf7dce8" containerID="f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795" exitCode=0 Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.970543 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rr5t6" Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.970593 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rr5t6" event={"ID":"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8","Type":"ContainerDied","Data":"f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795"} Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.970649 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rr5t6" event={"ID":"21ba4f8d-9d59-4adc-a797-6a0cadf7dce8","Type":"ContainerDied","Data":"dc45707a2f9356d0ba80f7c8056b0b94c6db8a5a57b9f4bd938b0eb02c346dd9"} Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.970685 4766 scope.go:117] "RemoveContainer" containerID="f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795" Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.973904 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h2hcg" event={"ID":"fd134238-f047-4601-a2e0-58781bdb521b","Type":"ContainerStarted","Data":"9c3f290039b2c90d2b2d8259e2bc22b791fa12a856f3d9d82e7a2601ee76b552"} Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.973974 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h2hcg" event={"ID":"fd134238-f047-4601-a2e0-58781bdb521b","Type":"ContainerStarted","Data":"3490add9224686bea1ac7a78da89f965929584404b6c7a7619abe62ef5f4b6c1"} Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.996984 4766 scope.go:117] "RemoveContainer" containerID="f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795" Dec 09 03:30:15 crc kubenswrapper[4766]: E1209 03:30:15.998267 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795\": container with ID starting with f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795 not found: ID does not exist" containerID="f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795" Dec 09 03:30:15 crc kubenswrapper[4766]: I1209 03:30:15.998308 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795"} err="failed to get container status \"f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795\": rpc error: code = NotFound desc = could not find container \"f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795\": container with ID starting with f088a1d6dba2d3dad3f34e261f40b93af65efb3fb3d89b42ca8748a7c83f7795 not found: ID does not exist" Dec 09 03:30:16 crc kubenswrapper[4766]: I1209 03:30:16.001280 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-h2hcg" podStartSLOduration=1.9535816590000001 podStartE2EDuration="2.001266541s" podCreationTimestamp="2025-12-09 03:30:14 +0000 UTC" firstStartedPulling="2025-12-09 03:30:15.20023 +0000 UTC m=+1096.909535426" lastFinishedPulling="2025-12-09 03:30:15.247914882 +0000 UTC m=+1096.957220308" observedRunningTime="2025-12-09 03:30:15.997124079 +0000 UTC m=+1097.706429545" watchObservedRunningTime="2025-12-09 03:30:16.001266541 +0000 UTC m=+1097.710571987" Dec 09 03:30:16 crc kubenswrapper[4766]: I1209 03:30:16.023791 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rr5t6"] Dec 09 03:30:16 crc kubenswrapper[4766]: I1209 03:30:16.028550 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rr5t6"] Dec 09 03:30:16 crc kubenswrapper[4766]: I1209 03:30:16.850406 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="21ba4f8d-9d59-4adc-a797-6a0cadf7dce8" path="/var/lib/kubelet/pods/21ba4f8d-9d59-4adc-a797-6a0cadf7dce8/volumes" Dec 09 03:30:24 crc kubenswrapper[4766]: I1209 03:30:24.791000 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:24 crc kubenswrapper[4766]: I1209 03:30:24.791560 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:24 crc kubenswrapper[4766]: I1209 03:30:24.829753 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:25 crc kubenswrapper[4766]: I1209 03:30:25.074298 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-h2hcg" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.108791 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9"] Dec 09 03:30:33 crc kubenswrapper[4766]: E1209 03:30:33.110042 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ba4f8d-9d59-4adc-a797-6a0cadf7dce8" containerName="registry-server" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.110061 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ba4f8d-9d59-4adc-a797-6a0cadf7dce8" containerName="registry-server" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.110194 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ba4f8d-9d59-4adc-a797-6a0cadf7dce8" containerName="registry-server" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.111500 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.114094 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ng5n6" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.135021 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9"] Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.152832 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-bundle\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.152942 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njw9m\" (UniqueName: \"kubernetes.io/projected/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-kube-api-access-njw9m\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.153018 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-util\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 
03:30:33.254179 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-bundle\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.254323 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njw9m\" (UniqueName: \"kubernetes.io/projected/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-kube-api-access-njw9m\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.254397 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-util\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.254708 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-util\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.254728 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-bundle\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.280475 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njw9m\" (UniqueName: \"kubernetes.io/projected/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-kube-api-access-njw9m\") pod \"e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.443511 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:33 crc kubenswrapper[4766]: I1209 03:30:33.742937 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9"] Dec 09 03:30:34 crc kubenswrapper[4766]: I1209 03:30:34.100496 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" event={"ID":"dac3aa15-44b2-4d5f-bd77-94928a5e3a62","Type":"ContainerStarted","Data":"1f95284489fd42dd17909a764efc2f5be447033643cb98acf36db2ac268a9652"} Dec 09 03:30:34 crc kubenswrapper[4766]: I1209 03:30:34.100808 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" event={"ID":"dac3aa15-44b2-4d5f-bd77-94928a5e3a62","Type":"ContainerStarted","Data":"8b2a7196445c32a7940af44f32eac9028a5c7e475d84df0898d68554df87c57e"} Dec 09 03:30:35 crc kubenswrapper[4766]: I1209 03:30:35.107016 4766 
generic.go:334] "Generic (PLEG): container finished" podID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerID="1f95284489fd42dd17909a764efc2f5be447033643cb98acf36db2ac268a9652" exitCode=0 Dec 09 03:30:35 crc kubenswrapper[4766]: I1209 03:30:35.107062 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" event={"ID":"dac3aa15-44b2-4d5f-bd77-94928a5e3a62","Type":"ContainerDied","Data":"1f95284489fd42dd17909a764efc2f5be447033643cb98acf36db2ac268a9652"} Dec 09 03:30:35 crc kubenswrapper[4766]: I1209 03:30:35.109833 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 03:30:36 crc kubenswrapper[4766]: I1209 03:30:36.116314 4766 generic.go:334] "Generic (PLEG): container finished" podID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerID="d41b85299642a8e5793f6130422304749f96ddcbc776d36c24c3f35afca0418b" exitCode=0 Dec 09 03:30:36 crc kubenswrapper[4766]: I1209 03:30:36.116381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" event={"ID":"dac3aa15-44b2-4d5f-bd77-94928a5e3a62","Type":"ContainerDied","Data":"d41b85299642a8e5793f6130422304749f96ddcbc776d36c24c3f35afca0418b"} Dec 09 03:30:37 crc kubenswrapper[4766]: I1209 03:30:37.124962 4766 generic.go:334] "Generic (PLEG): container finished" podID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerID="d4d2f1df0bba4381ab9355f6d0b15022cd3573763bdd9115fb38b1f55c683ded" exitCode=0 Dec 09 03:30:37 crc kubenswrapper[4766]: I1209 03:30:37.125065 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" event={"ID":"dac3aa15-44b2-4d5f-bd77-94928a5e3a62","Type":"ContainerDied","Data":"d4d2f1df0bba4381ab9355f6d0b15022cd3573763bdd9115fb38b1f55c683ded"} Dec 09 03:30:37 crc kubenswrapper[4766]: I1209 03:30:37.316880 
4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:30:37 crc kubenswrapper[4766]: I1209 03:30:37.316952 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.468423 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.578477 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-bundle\") pod \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.578624 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-util\") pod \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.578710 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njw9m\" (UniqueName: \"kubernetes.io/projected/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-kube-api-access-njw9m\") pod \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\" (UID: \"dac3aa15-44b2-4d5f-bd77-94928a5e3a62\") " Dec 09 03:30:38 crc kubenswrapper[4766]: 
I1209 03:30:38.579768 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-bundle" (OuterVolumeSpecName: "bundle") pod "dac3aa15-44b2-4d5f-bd77-94928a5e3a62" (UID: "dac3aa15-44b2-4d5f-bd77-94928a5e3a62"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.588846 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-kube-api-access-njw9m" (OuterVolumeSpecName: "kube-api-access-njw9m") pod "dac3aa15-44b2-4d5f-bd77-94928a5e3a62" (UID: "dac3aa15-44b2-4d5f-bd77-94928a5e3a62"). InnerVolumeSpecName "kube-api-access-njw9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.596733 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-util" (OuterVolumeSpecName: "util") pod "dac3aa15-44b2-4d5f-bd77-94928a5e3a62" (UID: "dac3aa15-44b2-4d5f-bd77-94928a5e3a62"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.680390 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.680427 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-util\") on node \"crc\" DevicePath \"\"" Dec 09 03:30:38 crc kubenswrapper[4766]: I1209 03:30:38.680439 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njw9m\" (UniqueName: \"kubernetes.io/projected/dac3aa15-44b2-4d5f-bd77-94928a5e3a62-kube-api-access-njw9m\") on node \"crc\" DevicePath \"\"" Dec 09 03:30:39 crc kubenswrapper[4766]: I1209 03:30:39.141171 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" event={"ID":"dac3aa15-44b2-4d5f-bd77-94928a5e3a62","Type":"ContainerDied","Data":"8b2a7196445c32a7940af44f32eac9028a5c7e475d84df0898d68554df87c57e"} Dec 09 03:30:39 crc kubenswrapper[4766]: I1209 03:30:39.141254 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b2a7196445c32a7940af44f32eac9028a5c7e475d84df0898d68554df87c57e" Dec 09 03:30:39 crc kubenswrapper[4766]: I1209 03:30:39.141255 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.382542 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp"] Dec 09 03:30:44 crc kubenswrapper[4766]: E1209 03:30:44.383131 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerName="util" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.383445 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerName="util" Dec 09 03:30:44 crc kubenswrapper[4766]: E1209 03:30:44.383477 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerName="pull" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.383486 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerName="pull" Dec 09 03:30:44 crc kubenswrapper[4766]: E1209 03:30:44.383498 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerName="extract" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.383508 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerName="extract" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.383649 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac3aa15-44b2-4d5f-bd77-94928a5e3a62" containerName="extract" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.384264 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.386752 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-g88gn" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.410781 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp"] Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.460101 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5cpg\" (UniqueName: \"kubernetes.io/projected/aa308d1e-c51d-4942-b7a0-a5b29e0649f0-kube-api-access-c5cpg\") pod \"openstack-operator-controller-operator-6799c88b79-8lfhp\" (UID: \"aa308d1e-c51d-4942-b7a0-a5b29e0649f0\") " pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.561668 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5cpg\" (UniqueName: \"kubernetes.io/projected/aa308d1e-c51d-4942-b7a0-a5b29e0649f0-kube-api-access-c5cpg\") pod \"openstack-operator-controller-operator-6799c88b79-8lfhp\" (UID: \"aa308d1e-c51d-4942-b7a0-a5b29e0649f0\") " pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.585848 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5cpg\" (UniqueName: \"kubernetes.io/projected/aa308d1e-c51d-4942-b7a0-a5b29e0649f0-kube-api-access-c5cpg\") pod \"openstack-operator-controller-operator-6799c88b79-8lfhp\" (UID: \"aa308d1e-c51d-4942-b7a0-a5b29e0649f0\") " pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.707384 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" Dec 09 03:30:44 crc kubenswrapper[4766]: I1209 03:30:44.939662 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp"] Dec 09 03:30:45 crc kubenswrapper[4766]: I1209 03:30:45.185656 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" event={"ID":"aa308d1e-c51d-4942-b7a0-a5b29e0649f0","Type":"ContainerStarted","Data":"029332d638711ae895d816bce719d555951aafb453b6d9a00a98ff6aa7aac7da"} Dec 09 03:30:49 crc kubenswrapper[4766]: I1209 03:30:49.210531 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" event={"ID":"aa308d1e-c51d-4942-b7a0-a5b29e0649f0","Type":"ContainerStarted","Data":"8b9a8d926630ba48bc53304a4a9fe11c927a62cb862d9a51cc74f416633dc42d"} Dec 09 03:30:49 crc kubenswrapper[4766]: I1209 03:30:49.211061 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" Dec 09 03:30:49 crc kubenswrapper[4766]: I1209 03:30:49.237001 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" podStartSLOduration=1.203017992 podStartE2EDuration="5.236983161s" podCreationTimestamp="2025-12-09 03:30:44 +0000 UTC" firstStartedPulling="2025-12-09 03:30:44.957620809 +0000 UTC m=+1126.666926235" lastFinishedPulling="2025-12-09 03:30:48.991585978 +0000 UTC m=+1130.700891404" observedRunningTime="2025-12-09 03:30:49.236585381 +0000 UTC m=+1130.945890817" watchObservedRunningTime="2025-12-09 03:30:49.236983161 +0000 UTC m=+1130.946288587" Dec 09 03:30:54 crc kubenswrapper[4766]: I1209 03:30:54.710747 
4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6799c88b79-8lfhp" Dec 09 03:31:07 crc kubenswrapper[4766]: I1209 03:31:07.316679 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:31:07 crc kubenswrapper[4766]: I1209 03:31:07.317488 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:31:07 crc kubenswrapper[4766]: I1209 03:31:07.317625 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:31:07 crc kubenswrapper[4766]: I1209 03:31:07.318684 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a0a8a7bff8971534685d42f33b8fda759b536f0edd8ff1b38fb2ef750399fde"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:31:07 crc kubenswrapper[4766]: I1209 03:31:07.318789 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://6a0a8a7bff8971534685d42f33b8fda759b536f0edd8ff1b38fb2ef750399fde" gracePeriod=600 Dec 09 03:31:08 crc kubenswrapper[4766]: I1209 03:31:08.357076 
4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="6a0a8a7bff8971534685d42f33b8fda759b536f0edd8ff1b38fb2ef750399fde" exitCode=0 Dec 09 03:31:08 crc kubenswrapper[4766]: I1209 03:31:08.357160 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"6a0a8a7bff8971534685d42f33b8fda759b536f0edd8ff1b38fb2ef750399fde"} Dec 09 03:31:08 crc kubenswrapper[4766]: I1209 03:31:08.357950 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"86ed0df3c1c9e8bf00e2182a8d0f2c1317600b335e2702ed7b54e17bf114fc74"} Dec 09 03:31:08 crc kubenswrapper[4766]: I1209 03:31:08.357993 4766 scope.go:117] "RemoveContainer" containerID="2f05dd63b19e38608a1f279fa1bc992ee750c24d1fb42681cf4619e9e86acf2d" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.823322 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.825026 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.828331 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xnf9q" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.839157 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.840616 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.843506 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-hnd2k" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.853256 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.866061 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.878272 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.879519 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.885244 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-z9xhf" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.905962 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-89h67"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.907045 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.909126 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzps\" (UniqueName: \"kubernetes.io/projected/bb6ed96a-0045-42ea-a13e-dc4a82714b9d-kube-api-access-crzps\") pod \"designate-operator-controller-manager-697fb699cf-p2sks\" (UID: \"bb6ed96a-0045-42ea-a13e-dc4a82714b9d\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.909172 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng7lr\" (UniqueName: \"kubernetes.io/projected/ac56e057-6439-40ab-bb04-ce228a828444-kube-api-access-ng7lr\") pod \"cinder-operator-controller-manager-6c677c69b-vvdmw\" (UID: \"ac56e057-6439-40ab-bb04-ce228a828444\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.911734 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xrn22" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.913350 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj78b\" (UniqueName: \"kubernetes.io/projected/ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0-kube-api-access-rj78b\") pod \"barbican-operator-controller-manager-7d9dfd778-gc98f\" (UID: \"ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.913436 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnjhx\" (UniqueName: 
\"kubernetes.io/projected/95b141d2-5fb4-46f6-b5af-92720c9be11c-kube-api-access-cnjhx\") pod \"glance-operator-controller-manager-5697bb5779-89h67\" (UID: \"95b141d2-5fb4-46f6-b5af-92720c9be11c\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.916333 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.921514 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-89h67"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.941125 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.942080 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.948479 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rtj2h" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.954632 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.974273 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.975147 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97"] Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.975240 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" Dec 09 03:31:32 crc kubenswrapper[4766]: I1209 03:31:32.977492 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gwltc" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.007087 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.008024 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.012234 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.012471 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9497p" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.017417 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crzps\" (UniqueName: \"kubernetes.io/projected/bb6ed96a-0045-42ea-a13e-dc4a82714b9d-kube-api-access-crzps\") pod \"designate-operator-controller-manager-697fb699cf-p2sks\" (UID: \"bb6ed96a-0045-42ea-a13e-dc4a82714b9d\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.017476 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7lr\" (UniqueName: \"kubernetes.io/projected/ac56e057-6439-40ab-bb04-ce228a828444-kube-api-access-ng7lr\") pod \"cinder-operator-controller-manager-6c677c69b-vvdmw\" (UID: \"ac56e057-6439-40ab-bb04-ce228a828444\") " 
pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.017519 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.017535 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8db5d\" (UniqueName: \"kubernetes.io/projected/735b0ee4-6bb2-41f4-b6e9-494e5d73b584-kube-api-access-8db5d\") pod \"horizon-operator-controller-manager-68c6d99b8f-6rt97\" (UID: \"735b0ee4-6bb2-41f4-b6e9-494e5d73b584\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.017554 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj78b\" (UniqueName: \"kubernetes.io/projected/ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0-kube-api-access-rj78b\") pod \"barbican-operator-controller-manager-7d9dfd778-gc98f\" (UID: \"ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.017573 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnjhx\" (UniqueName: \"kubernetes.io/projected/95b141d2-5fb4-46f6-b5af-92720c9be11c-kube-api-access-cnjhx\") pod \"glance-operator-controller-manager-5697bb5779-89h67\" (UID: \"95b141d2-5fb4-46f6-b5af-92720c9be11c\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.017588 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9zcd\" (UniqueName: \"kubernetes.io/projected/ef10710a-81c7-46f4-8c5c-3aabdc02a833-kube-api-access-s9zcd\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.017614 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6f9v\" (UniqueName: \"kubernetes.io/projected/146239d5-9360-4fa2-8a76-01743455b5f1-kube-api-access-k6f9v\") pod \"heat-operator-controller-manager-5f64f6f8bb-5z662\" (UID: \"146239d5-9360-4fa2-8a76-01743455b5f1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.025633 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.043592 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-ng459"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.047093 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.049980 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng7lr\" (UniqueName: \"kubernetes.io/projected/ac56e057-6439-40ab-bb04-ce228a828444-kube-api-access-ng7lr\") pod \"cinder-operator-controller-manager-6c677c69b-vvdmw\" (UID: \"ac56e057-6439-40ab-bb04-ce228a828444\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.050492 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-ng459"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.051818 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnjhx\" (UniqueName: \"kubernetes.io/projected/95b141d2-5fb4-46f6-b5af-92720c9be11c-kube-api-access-cnjhx\") pod \"glance-operator-controller-manager-5697bb5779-89h67\" (UID: \"95b141d2-5fb4-46f6-b5af-92720c9be11c\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.052024 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-l8sfc" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.059295 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.061073 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.063612 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gvh66" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.064011 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crzps\" (UniqueName: \"kubernetes.io/projected/bb6ed96a-0045-42ea-a13e-dc4a82714b9d-kube-api-access-crzps\") pod \"designate-operator-controller-manager-697fb699cf-p2sks\" (UID: \"bb6ed96a-0045-42ea-a13e-dc4a82714b9d\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.064279 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj78b\" (UniqueName: \"kubernetes.io/projected/ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0-kube-api-access-rj78b\") pod \"barbican-operator-controller-manager-7d9dfd778-gc98f\" (UID: \"ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.071724 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.073424 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.076657 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rfc84" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.101070 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.102099 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.104368 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-p592v" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.114646 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.139435 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6f9v\" (UniqueName: \"kubernetes.io/projected/146239d5-9360-4fa2-8a76-01743455b5f1-kube-api-access-k6f9v\") pod \"heat-operator-controller-manager-5f64f6f8bb-5z662\" (UID: \"146239d5-9360-4fa2-8a76-01743455b5f1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.139913 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d442x\" (UniqueName: \"kubernetes.io/projected/b6f43661-b796-4f31-aaa9-482f85952578-kube-api-access-d442x\") pod \"manila-operator-controller-manager-5b5fd79c9c-shhtg\" (UID: \"b6f43661-b796-4f31-aaa9-482f85952578\") " 
pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.143277 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpjvs\" (UniqueName: \"kubernetes.io/projected/e8d80517-4f1e-4123-b002-e7990cc9c945-kube-api-access-tpjvs\") pod \"ironic-operator-controller-manager-967d97867-ng459\" (UID: \"e8d80517-4f1e-4123-b002-e7990cc9c945\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.145464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xrs\" (UniqueName: \"kubernetes.io/projected/3467a8bf-3f6b-49ce-9e3a-5d834456bbaf-kube-api-access-86xrs\") pod \"mariadb-operator-controller-manager-79c8c4686c-mwbr2\" (UID: \"3467a8bf-3f6b-49ce-9e3a-5d834456bbaf\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.145570 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.145607 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8db5d\" (UniqueName: \"kubernetes.io/projected/735b0ee4-6bb2-41f4-b6e9-494e5d73b584-kube-api-access-8db5d\") pod \"horizon-operator-controller-manager-68c6d99b8f-6rt97\" (UID: \"735b0ee4-6bb2-41f4-b6e9-494e5d73b584\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.145642 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9zcd\" (UniqueName: \"kubernetes.io/projected/ef10710a-81c7-46f4-8c5c-3aabdc02a833-kube-api-access-s9zcd\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.145678 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx65k\" (UniqueName: \"kubernetes.io/projected/50ba0df7-74df-4798-a0c0-39dda9c4e3ef-kube-api-access-wx65k\") pod \"keystone-operator-controller-manager-7765d96ddf-7cdxh\" (UID: \"50ba0df7-74df-4798-a0c0-39dda9c4e3ef\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.146041 4766 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.146099 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert podName:ef10710a-81c7-46f4-8c5c-3aabdc02a833 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:33.646076558 +0000 UTC m=+1175.355381984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert") pod "infra-operator-controller-manager-78d48bff9d-sn9z2" (UID: "ef10710a-81c7-46f4-8c5c-3aabdc02a833") : secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.149900 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.165184 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.186504 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.197838 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9zcd\" (UniqueName: \"kubernetes.io/projected/ef10710a-81c7-46f4-8c5c-3aabdc02a833-kube-api-access-s9zcd\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.204414 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.212309 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8db5d\" (UniqueName: \"kubernetes.io/projected/735b0ee4-6bb2-41f4-b6e9-494e5d73b584-kube-api-access-8db5d\") pod \"horizon-operator-controller-manager-68c6d99b8f-6rt97\" (UID: \"735b0ee4-6bb2-41f4-b6e9-494e5d73b584\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.218350 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6f9v\" (UniqueName: \"kubernetes.io/projected/146239d5-9360-4fa2-8a76-01743455b5f1-kube-api-access-k6f9v\") pod \"heat-operator-controller-manager-5f64f6f8bb-5z662\" (UID: \"146239d5-9360-4fa2-8a76-01743455b5f1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.221272 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.231481 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.232431 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.233741 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.235620 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.235880 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-j6vq4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.246112 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.247237 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.249593 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-zwcv9" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.250769 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.252712 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.253063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d442x\" (UniqueName: \"kubernetes.io/projected/b6f43661-b796-4f31-aaa9-482f85952578-kube-api-access-d442x\") pod \"manila-operator-controller-manager-5b5fd79c9c-shhtg\" (UID: \"b6f43661-b796-4f31-aaa9-482f85952578\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.253111 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpjvs\" (UniqueName: \"kubernetes.io/projected/e8d80517-4f1e-4123-b002-e7990cc9c945-kube-api-access-tpjvs\") pod \"ironic-operator-controller-manager-967d97867-ng459\" (UID: \"e8d80517-4f1e-4123-b002-e7990cc9c945\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.253140 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xrs\" (UniqueName: \"kubernetes.io/projected/3467a8bf-3f6b-49ce-9e3a-5d834456bbaf-kube-api-access-86xrs\") pod \"mariadb-operator-controller-manager-79c8c4686c-mwbr2\" (UID: \"3467a8bf-3f6b-49ce-9e3a-5d834456bbaf\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.253243 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx65k\" (UniqueName: \"kubernetes.io/projected/50ba0df7-74df-4798-a0c0-39dda9c4e3ef-kube-api-access-wx65k\") pod \"keystone-operator-controller-manager-7765d96ddf-7cdxh\" (UID: \"50ba0df7-74df-4798-a0c0-39dda9c4e3ef\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 
03:31:33.255290 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.256128 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-s4hl9" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.277492 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.277808 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.283441 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpjvs\" (UniqueName: \"kubernetes.io/projected/e8d80517-4f1e-4123-b002-e7990cc9c945-kube-api-access-tpjvs\") pod \"ironic-operator-controller-manager-967d97867-ng459\" (UID: \"e8d80517-4f1e-4123-b002-e7990cc9c945\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.295909 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d442x\" (UniqueName: \"kubernetes.io/projected/b6f43661-b796-4f31-aaa9-482f85952578-kube-api-access-d442x\") pod \"manila-operator-controller-manager-5b5fd79c9c-shhtg\" (UID: \"b6f43661-b796-4f31-aaa9-482f85952578\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.298835 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.300625 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.305619 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.306886 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.307787 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx65k\" (UniqueName: \"kubernetes.io/projected/50ba0df7-74df-4798-a0c0-39dda9c4e3ef-kube-api-access-wx65k\") pod \"keystone-operator-controller-manager-7765d96ddf-7cdxh\" (UID: \"50ba0df7-74df-4798-a0c0-39dda9c4e3ef\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.309445 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7plqd" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.309824 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xrs\" (UniqueName: \"kubernetes.io/projected/3467a8bf-3f6b-49ce-9e3a-5d834456bbaf-kube-api-access-86xrs\") pod \"mariadb-operator-controller-manager-79c8c4686c-mwbr2\" (UID: \"3467a8bf-3f6b-49ce-9e3a-5d834456bbaf\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.356048 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwsfl\" (UniqueName: \"kubernetes.io/projected/d1aa5579-7f42-4ee2-af62-3b2a7d391101-kube-api-access-hwsfl\") pod 
\"octavia-operator-controller-manager-998648c74-rb9kw\" (UID: \"d1aa5579-7f42-4ee2-af62-3b2a7d391101\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.356363 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dmfd\" (UniqueName: \"kubernetes.io/projected/14e6b34f-8412-44ea-8342-0350ccf7f7c9-kube-api-access-5dmfd\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dgds5\" (UID: \"14e6b34f-8412-44ea-8342-0350ccf7f7c9\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.356392 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.356459 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f5tz\" (UniqueName: \"kubernetes.io/projected/b2580526-5f4a-4340-9856-bc78a320e610-kube-api-access-8f5tz\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.356505 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9lhs\" (UniqueName: \"kubernetes.io/projected/3b4cd02e-a82d-48d9-8078-9cdf3a65767c-kube-api-access-z9lhs\") pod \"nova-operator-controller-manager-697bc559fc-qfdx4\" (UID: 
\"3b4cd02e-a82d-48d9-8078-9cdf3a65767c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.356582 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.357483 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.357506 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rllsr"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.358226 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.369975 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.377746 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-w88cz" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.377917 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-rfmkj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.392290 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.413658 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rllsr"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.440274 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.441417 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.442129 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.446387 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-w4h5g" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.452310 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.457282 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9lhs\" (UniqueName: \"kubernetes.io/projected/3b4cd02e-a82d-48d9-8078-9cdf3a65767c-kube-api-access-z9lhs\") pod \"nova-operator-controller-manager-697bc559fc-qfdx4\" (UID: \"3b4cd02e-a82d-48d9-8078-9cdf3a65767c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.457330 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-822l4\" (UniqueName: \"kubernetes.io/projected/0d1a15a1-afd0-41a2-bf4d-21d98d4730b4-kube-api-access-822l4\") pod \"ovn-operator-controller-manager-b6456fdb6-ln78g\" (UID: \"0d1a15a1-afd0-41a2-bf4d-21d98d4730b4\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.457376 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvzdq\" (UniqueName: \"kubernetes.io/projected/3aea0f89-2c0b-4705-92cf-17a37169675e-kube-api-access-mvzdq\") pod \"swift-operator-controller-manager-9d58d64bc-whnqg\" (UID: \"3aea0f89-2c0b-4705-92cf-17a37169675e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.457404 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hwsfl\" (UniqueName: \"kubernetes.io/projected/d1aa5579-7f42-4ee2-af62-3b2a7d391101-kube-api-access-hwsfl\") pod \"octavia-operator-controller-manager-998648c74-rb9kw\" (UID: \"d1aa5579-7f42-4ee2-af62-3b2a7d391101\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.457431 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dmfd\" (UniqueName: \"kubernetes.io/projected/14e6b34f-8412-44ea-8342-0350ccf7f7c9-kube-api-access-5dmfd\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dgds5\" (UID: \"14e6b34f-8412-44ea-8342-0350ccf7f7c9\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.457465 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.457537 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f5tz\" (UniqueName: \"kubernetes.io/projected/b2580526-5f4a-4340-9856-bc78a320e610-kube-api-access-8f5tz\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.457560 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wscx9\" (UniqueName: 
\"kubernetes.io/projected/636d7080-c310-4a2a-a07c-ef0aab6412ba-kube-api-access-wscx9\") pod \"placement-operator-controller-manager-78f8948974-rllsr\" (UID: \"636d7080-c310-4a2a-a07c-ef0aab6412ba\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.458812 4766 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.458873 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert podName:b2580526-5f4a-4340-9856-bc78a320e610 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:33.958852734 +0000 UTC m=+1175.668158160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fbrcdj" (UID: "b2580526-5f4a-4340-9856-bc78a320e610") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.459443 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.470762 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.499279 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwsfl\" (UniqueName: \"kubernetes.io/projected/d1aa5579-7f42-4ee2-af62-3b2a7d391101-kube-api-access-hwsfl\") pod \"octavia-operator-controller-manager-998648c74-rb9kw\" (UID: \"d1aa5579-7f42-4ee2-af62-3b2a7d391101\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.510113 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.511254 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.521978 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f5tz\" (UniqueName: \"kubernetes.io/projected/b2580526-5f4a-4340-9856-bc78a320e610-kube-api-access-8f5tz\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.533559 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.535456 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9lhs\" (UniqueName: \"kubernetes.io/projected/3b4cd02e-a82d-48d9-8078-9cdf3a65767c-kube-api-access-z9lhs\") pod \"nova-operator-controller-manager-697bc559fc-qfdx4\" (UID: 
\"3b4cd02e-a82d-48d9-8078-9cdf3a65767c\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.535892 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.551650 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wzz2w" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.552964 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dmfd\" (UniqueName: \"kubernetes.io/projected/14e6b34f-8412-44ea-8342-0350ccf7f7c9-kube-api-access-5dmfd\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-dgds5\" (UID: \"14e6b34f-8412-44ea-8342-0350ccf7f7c9\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.560877 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgcjh\" (UniqueName: \"kubernetes.io/projected/f911184d-8572-4bc3-abbd-b76245ce463c-kube-api-access-mgcjh\") pod \"telemetry-operator-controller-manager-58d5ff84df-fk6b4\" (UID: \"f911184d-8572-4bc3-abbd-b76245ce463c\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.560961 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wscx9\" (UniqueName: \"kubernetes.io/projected/636d7080-c310-4a2a-a07c-ef0aab6412ba-kube-api-access-wscx9\") pod \"placement-operator-controller-manager-78f8948974-rllsr\" (UID: \"636d7080-c310-4a2a-a07c-ef0aab6412ba\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 
03:31:33.560997 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-822l4\" (UniqueName: \"kubernetes.io/projected/0d1a15a1-afd0-41a2-bf4d-21d98d4730b4-kube-api-access-822l4\") pod \"ovn-operator-controller-manager-b6456fdb6-ln78g\" (UID: \"0d1a15a1-afd0-41a2-bf4d-21d98d4730b4\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.561025 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvzdq\" (UniqueName: \"kubernetes.io/projected/3aea0f89-2c0b-4705-92cf-17a37169675e-kube-api-access-mvzdq\") pod \"swift-operator-controller-manager-9d58d64bc-whnqg\" (UID: \"3aea0f89-2c0b-4705-92cf-17a37169675e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.580674 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.587056 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-822l4\" (UniqueName: \"kubernetes.io/projected/0d1a15a1-afd0-41a2-bf4d-21d98d4730b4-kube-api-access-822l4\") pod \"ovn-operator-controller-manager-b6456fdb6-ln78g\" (UID: \"0d1a15a1-afd0-41a2-bf4d-21d98d4730b4\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.595183 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvzdq\" (UniqueName: \"kubernetes.io/projected/3aea0f89-2c0b-4705-92cf-17a37169675e-kube-api-access-mvzdq\") pod \"swift-operator-controller-manager-9d58d64bc-whnqg\" (UID: \"3aea0f89-2c0b-4705-92cf-17a37169675e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" Dec 09 03:31:33 crc 
kubenswrapper[4766]: I1209 03:31:33.608247 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wscx9\" (UniqueName: \"kubernetes.io/projected/636d7080-c310-4a2a-a07c-ef0aab6412ba-kube-api-access-wscx9\") pod \"placement-operator-controller-manager-78f8948974-rllsr\" (UID: \"636d7080-c310-4a2a-a07c-ef0aab6412ba\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.617612 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.634940 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-62wlz"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.643916 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.646714 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xx5fx" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.655530 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-62wlz"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.662398 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.662483 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq2pc\" (UniqueName: \"kubernetes.io/projected/4fa83023-eb1d-4ab9-b80a-b9bd04d342a3-kube-api-access-pq2pc\") pod \"test-operator-controller-manager-5854674fcc-62wlz\" (UID: \"4fa83023-eb1d-4ab9-b80a-b9bd04d342a3\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.662512 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgcjh\" (UniqueName: \"kubernetes.io/projected/f911184d-8572-4bc3-abbd-b76245ce463c-kube-api-access-mgcjh\") pod \"telemetry-operator-controller-manager-58d5ff84df-fk6b4\" (UID: \"f911184d-8572-4bc3-abbd-b76245ce463c\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.662969 4766 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.663007 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert podName:ef10710a-81c7-46f4-8c5c-3aabdc02a833 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:34.662993256 +0000 UTC m=+1176.372298682 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert") pod "infra-operator-controller-manager-78d48bff9d-sn9z2" (UID: "ef10710a-81c7-46f4-8c5c-3aabdc02a833") : secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.697227 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgcjh\" (UniqueName: \"kubernetes.io/projected/f911184d-8572-4bc3-abbd-b76245ce463c-kube-api-access-mgcjh\") pod \"telemetry-operator-controller-manager-58d5ff84df-fk6b4\" (UID: \"f911184d-8572-4bc3-abbd-b76245ce463c\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.709506 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.727803 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.727929 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.736413 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-pjc8w" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.767644 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq2pc\" (UniqueName: \"kubernetes.io/projected/4fa83023-eb1d-4ab9-b80a-b9bd04d342a3-kube-api-access-pq2pc\") pod \"test-operator-controller-manager-5854674fcc-62wlz\" (UID: \"4fa83023-eb1d-4ab9-b80a-b9bd04d342a3\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.767912 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh95n\" (UniqueName: \"kubernetes.io/projected/c3606954-b281-4fa3-b96a-a62ab4092a78-kube-api-access-zh95n\") pod \"watcher-operator-controller-manager-667bd8d554-74tq4\" (UID: \"c3606954-b281-4fa3-b96a-a62ab4092a78\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.782281 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.791997 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq2pc\" (UniqueName: \"kubernetes.io/projected/4fa83023-eb1d-4ab9-b80a-b9bd04d342a3-kube-api-access-pq2pc\") pod \"test-operator-controller-manager-5854674fcc-62wlz\" (UID: \"4fa83023-eb1d-4ab9-b80a-b9bd04d342a3\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.822716 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.823662 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.825416 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.825574 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.826571 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5l4lg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.835394 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.843357 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.851803 4766 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.854944 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.861363 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.864084 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.867303 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-xn7xv" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.868870 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.868978 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh95n\" (UniqueName: \"kubernetes.io/projected/c3606954-b281-4fa3-b96a-a62ab4092a78-kube-api-access-zh95n\") pod \"watcher-operator-controller-manager-667bd8d554-74tq4\" (UID: \"c3606954-b281-4fa3-b96a-a62ab4092a78\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.869015 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmxz\" (UniqueName: \"kubernetes.io/projected/a315c06a-893b-4f1b-9b0e-0120afcfeb00-kube-api-access-ccmxz\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.869052 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.888510 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.888968 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.910283 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh95n\" (UniqueName: \"kubernetes.io/projected/c3606954-b281-4fa3-b96a-a62ab4092a78-kube-api-access-zh95n\") pod \"watcher-operator-controller-manager-667bd8d554-74tq4\" (UID: \"c3606954-b281-4fa3-b96a-a62ab4092a78\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.930146 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.946645 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks"] Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.973168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmxz\" (UniqueName: \"kubernetes.io/projected/a315c06a-893b-4f1b-9b0e-0120afcfeb00-kube-api-access-ccmxz\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.973236 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.973279 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.973331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: 
\"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.973360 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvw2\" (UniqueName: \"kubernetes.io/projected/581540bb-4d83-41b5-9384-e91238383025-kube-api-access-7gvw2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dxctt\" (UID: \"581540bb-4d83-41b5-9384-e91238383025\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.973445 4766 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.973501 4766 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.973513 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:34.473495781 +0000 UTC m=+1176.182801207 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "metrics-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.973445 4766 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.973544 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert podName:b2580526-5f4a-4340-9856-bc78a320e610 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:34.973529662 +0000 UTC m=+1176.682835078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fbrcdj" (UID: "b2580526-5f4a-4340-9856-bc78a320e610") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: E1209 03:31:33.973590 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:34.473577073 +0000 UTC m=+1176.182882499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "webhook-server-cert" not found Dec 09 03:31:33 crc kubenswrapper[4766]: I1209 03:31:33.996300 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmxz\" (UniqueName: \"kubernetes.io/projected/a315c06a-893b-4f1b-9b0e-0120afcfeb00-kube-api-access-ccmxz\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.053399 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.074508 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvw2\" (UniqueName: \"kubernetes.io/projected/581540bb-4d83-41b5-9384-e91238383025-kube-api-access-7gvw2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dxctt\" (UID: \"581540bb-4d83-41b5-9384-e91238383025\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.095349 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvw2\" (UniqueName: \"kubernetes.io/projected/581540bb-4d83-41b5-9384-e91238383025-kube-api-access-7gvw2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dxctt\" (UID: \"581540bb-4d83-41b5-9384-e91238383025\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.117758 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.205811 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.674586 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.674680 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.674711 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.674775 4766 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.674850 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:35.674828142 +0000 UTC m=+1177.384133678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "metrics-server-cert" not found Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.674870 4766 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.674909 4766 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.674927 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert podName:ef10710a-81c7-46f4-8c5c-3aabdc02a833 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:36.674908364 +0000 UTC m=+1178.384213880 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert") pod "infra-operator-controller-manager-78d48bff9d-sn9z2" (UID: "ef10710a-81c7-46f4-8c5c-3aabdc02a833") : secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.674949 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:35.674936555 +0000 UTC m=+1177.384242081 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "webhook-server-cert" not found Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.689994 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" event={"ID":"bb6ed96a-0045-42ea-a13e-dc4a82714b9d","Type":"ContainerStarted","Data":"7b7088bcb98fd8d59202db774599674189697fc74bd7343f4bb0d53b1113c991"} Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.721245 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" event={"ID":"ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0","Type":"ContainerStarted","Data":"92de0efd4e562728cd86b8f923b6dd0ef3fc7cb5211176656bb3cfc407abe56c"} Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.742867 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.758041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.768343 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-ng459"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.773201 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.804521 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662"] Dec 09 03:31:34 crc 
kubenswrapper[4766]: I1209 03:31:34.816538 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-89h67"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.832497 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.839229 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97"] Dec 09 03:31:34 crc kubenswrapper[4766]: W1209 03:31:34.852281 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aea0f89_2c0b_4705_92cf_17a37169675e.slice/crio-f47e5060ef61a0aa849be9ee13e705175b2711db9cf4546262f177d0e8f99bd7 WatchSource:0}: Error finding container f47e5060ef61a0aa849be9ee13e705175b2711db9cf4546262f177d0e8f99bd7: Status 404 returned error can't find the container with id f47e5060ef61a0aa849be9ee13e705175b2711db9cf4546262f177d0e8f99bd7 Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.868150 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wscx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-rllsr_openstack-operators(636d7080-c310-4a2a-a07c-ef0aab6412ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.868645 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.868683 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.868698 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rllsr"] Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.871549 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wscx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-rllsr_openstack-operators(636d7080-c310-4a2a-a07c-ef0aab6412ba): ErrImagePull: pull QPS exceeded" 
logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.871655 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z9lhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-qfdx4_openstack-operators(3b4cd02e-a82d-48d9-8078-9cdf3a65767c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.873549 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" podUID="636d7080-c310-4a2a-a07c-ef0aab6412ba" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.875251 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z9lhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-qfdx4_openstack-operators(3b4cd02e-a82d-48d9-8078-9cdf3a65767c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.876329 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" podUID="3b4cd02e-a82d-48d9-8078-9cdf3a65767c" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.879950 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5dmfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-dgds5_openstack-operators(14e6b34f-8412-44ea-8342-0350ccf7f7c9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.880273 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5"] Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.881917 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5dmfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-dgds5_openstack-operators(14e6b34f-8412-44ea-8342-0350ccf7f7c9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.883110 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" podUID="14e6b34f-8412-44ea-8342-0350ccf7f7c9" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.892229 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7gvw2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-dxctt_openstack-operators(581540bb-4d83-41b5-9384-e91238383025): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.893428 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" podUID="581540bb-4d83-41b5-9384-e91238383025" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.896816 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.903161 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.962467 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.967507 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g"] Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.972166 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-62wlz"] Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.978760 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-822l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-ln78g_openstack-operators(0d1a15a1-afd0-41a2-bf4d-21d98d4730b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.978883 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwsfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-rb9kw_openstack-operators(d1aa5579-7f42-4ee2-af62-3b2a7d391101): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.979057 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pq2pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-62wlz_openstack-operators(4fa83023-eb1d-4ab9-b80a-b9bd04d342a3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.981044 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-822l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-ln78g_openstack-operators(0d1a15a1-afd0-41a2-bf4d-21d98d4730b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.981065 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwsfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-rb9kw_openstack-operators(d1aa5579-7f42-4ee2-af62-3b2a7d391101): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.981703 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.982451 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" podUID="0d1a15a1-afd0-41a2-bf4d-21d98d4730b4" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.983670 4766 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" podUID="d1aa5579-7f42-4ee2-af62-3b2a7d391101" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.983728 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pq2pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
test-operator-controller-manager-5854674fcc-62wlz_openstack-operators(4fa83023-eb1d-4ab9-b80a-b9bd04d342a3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.984032 4766 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.984066 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert podName:b2580526-5f4a-4340-9856-bc78a320e610 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:36.984052681 +0000 UTC m=+1178.693358107 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fbrcdj" (UID: "b2580526-5f4a-4340-9856-bc78a320e610") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:34 crc kubenswrapper[4766]: E1209 03:31:34.985231 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" podUID="4fa83023-eb1d-4ab9-b80a-b9bd04d342a3" Dec 09 03:31:34 crc kubenswrapper[4766]: I1209 03:31:34.989884 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4"] Dec 09 03:31:35 crc kubenswrapper[4766]: W1209 03:31:35.008449 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf911184d_8572_4bc3_abbd_b76245ce463c.slice/crio-190430ab28813808626a6cd3856f59b26784723e5fc6f32cf691fd821494732a WatchSource:0}: Error finding container 190430ab28813808626a6cd3856f59b26784723e5fc6f32cf691fd821494732a: Status 404 returned error can't find the container with id 190430ab28813808626a6cd3856f59b26784723e5fc6f32cf691fd821494732a Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.696834 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.697194 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.697405 4766 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.697463 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:37.697444506 +0000 UTC m=+1179.406749942 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "webhook-server-cert" not found Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.697871 4766 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.697922 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:37.697903399 +0000 UTC m=+1179.407208825 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "metrics-server-cert" not found Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.740842 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" event={"ID":"735b0ee4-6bb2-41f4-b6e9-494e5d73b584","Type":"ContainerStarted","Data":"e4d0b389d93c383f9708326013d2894fa834951202dd5f2b025c3a49c66c1fab"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.744183 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" event={"ID":"ac56e057-6439-40ab-bb04-ce228a828444","Type":"ContainerStarted","Data":"4110838897c06ec879cf951676436211f45a34c757bce95f712757a3062608ef"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.745612 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" event={"ID":"e8d80517-4f1e-4123-b002-e7990cc9c945","Type":"ContainerStarted","Data":"f2837790b958bb140451c9e10f93da649c1390167fae831303d39e0c5938599b"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.746799 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" event={"ID":"14e6b34f-8412-44ea-8342-0350ccf7f7c9","Type":"ContainerStarted","Data":"8eb83907970df0e3a43b7821eed189a3dada11d8c26b5b52a679120df162bc4c"} Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.752320 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" podUID="14e6b34f-8412-44ea-8342-0350ccf7f7c9" Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.758730 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" event={"ID":"636d7080-c310-4a2a-a07c-ef0aab6412ba","Type":"ContainerStarted","Data":"95facfb925343d5278b99127f3784fe4a0e898fcfcff354e5845a7c22fa48810"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.760441 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" event={"ID":"50ba0df7-74df-4798-a0c0-39dda9c4e3ef","Type":"ContainerStarted","Data":"28c33f96fcb599ab203a8c6dbe1cd6020a6ff4a50b41f61bcba7516ef9e036b3"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.763520 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" event={"ID":"b6f43661-b796-4f31-aaa9-482f85952578","Type":"ContainerStarted","Data":"ffc402d4a85282837854b88aa2067d6e5fd527715f6c7333b1f66a68dc917464"} Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.763769 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" podUID="636d7080-c310-4a2a-a07c-ef0aab6412ba" Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.764896 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" event={"ID":"95b141d2-5fb4-46f6-b5af-92720c9be11c","Type":"ContainerStarted","Data":"de1184c0ae14656779a887ae1d1c3cbcc3b7e286e3bb0d69c12f303232c32278"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.766033 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" event={"ID":"581540bb-4d83-41b5-9384-e91238383025","Type":"ContainerStarted","Data":"ca1b7d6dc8c811500251395181eace92fcb2db6d8b55622f2c1ba9b386204407"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.767062 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" event={"ID":"3aea0f89-2c0b-4705-92cf-17a37169675e","Type":"ContainerStarted","Data":"f47e5060ef61a0aa849be9ee13e705175b2711db9cf4546262f177d0e8f99bd7"} Dec 09 03:31:35 crc 
kubenswrapper[4766]: E1209 03:31:35.773556 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" podUID="581540bb-4d83-41b5-9384-e91238383025" Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.779189 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" event={"ID":"f911184d-8572-4bc3-abbd-b76245ce463c","Type":"ContainerStarted","Data":"190430ab28813808626a6cd3856f59b26784723e5fc6f32cf691fd821494732a"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.783416 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" event={"ID":"c3606954-b281-4fa3-b96a-a62ab4092a78","Type":"ContainerStarted","Data":"559f8d84ffd3055e580ae3dd1e1b9a2fede0f4915b7b49222616281e7aa3567f"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.784878 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" event={"ID":"146239d5-9360-4fa2-8a76-01743455b5f1","Type":"ContainerStarted","Data":"90d7015ab7e15b80e1f1ddf8c422a09aa1658d8a41057ea9772c3c65763eecb6"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.793439 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" event={"ID":"0d1a15a1-afd0-41a2-bf4d-21d98d4730b4","Type":"ContainerStarted","Data":"7e59f329fce2923f4e8642c6da5b09ca19d73b6c34a0d2ac5e1fb86ecc4515b8"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.796064 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" event={"ID":"3467a8bf-3f6b-49ce-9e3a-5d834456bbaf","Type":"ContainerStarted","Data":"258c373f17b8523387ca76e6adc6b590b25c9f60b5e7201377e323ae63523b6a"} Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.796262 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" podUID="0d1a15a1-afd0-41a2-bf4d-21d98d4730b4" Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.797024 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" event={"ID":"4fa83023-eb1d-4ab9-b80a-b9bd04d342a3","Type":"ContainerStarted","Data":"05b6446c185c6f35484b7e613d19787760d3db6c89d65f04c69237e4ed695de9"} Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.799227 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" podUID="4fa83023-eb1d-4ab9-b80a-b9bd04d342a3" Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.799515 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" event={"ID":"d1aa5579-7f42-4ee2-af62-3b2a7d391101","Type":"ContainerStarted","Data":"222a86a1ab1d76a6378977947e9fcf9dd9640cfcd27e44eccb88e66129ba1b33"} Dec 09 03:31:35 crc kubenswrapper[4766]: I1209 03:31:35.801057 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" event={"ID":"3b4cd02e-a82d-48d9-8078-9cdf3a65767c","Type":"ContainerStarted","Data":"0987f0a1a43c1228f6a4792aa9793febbb4f7ead5c05eb7baf8b2dcff597e825"} Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.801990 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" podUID="d1aa5579-7f42-4ee2-af62-3b2a7d391101" Dec 09 03:31:35 crc kubenswrapper[4766]: E1209 03:31:35.806644 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" podUID="3b4cd02e-a82d-48d9-8078-9cdf3a65767c" Dec 09 03:31:36 crc kubenswrapper[4766]: I1209 03:31:36.721400 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.723582 4766 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.723680 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert podName:ef10710a-81c7-46f4-8c5c-3aabdc02a833 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:40.723665138 +0000 UTC m=+1182.432970564 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert") pod "infra-operator-controller-manager-78d48bff9d-sn9z2" (UID: "ef10710a-81c7-46f4-8c5c-3aabdc02a833") : secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.836224 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" podUID="581540bb-4d83-41b5-9384-e91238383025" Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.838941 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" podUID="3b4cd02e-a82d-48d9-8078-9cdf3a65767c" Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.839030 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" podUID="14e6b34f-8412-44ea-8342-0350ccf7f7c9" Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.839114 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" podUID="636d7080-c310-4a2a-a07c-ef0aab6412ba" Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.839194 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" podUID="4fa83023-eb1d-4ab9-b80a-b9bd04d342a3" Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.839367 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" podUID="d1aa5579-7f42-4ee2-af62-3b2a7d391101" Dec 09 03:31:36 crc kubenswrapper[4766]: E1209 03:31:36.839811 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" podUID="0d1a15a1-afd0-41a2-bf4d-21d98d4730b4" Dec 09 03:31:37 crc kubenswrapper[4766]: I1209 03:31:37.025900 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:37 crc kubenswrapper[4766]: E1209 
03:31:37.026137 4766 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:37 crc kubenswrapper[4766]: E1209 03:31:37.026193 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert podName:b2580526-5f4a-4340-9856-bc78a320e610 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:41.026179028 +0000 UTC m=+1182.735484444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fbrcdj" (UID: "b2580526-5f4a-4340-9856-bc78a320e610") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:37 crc kubenswrapper[4766]: I1209 03:31:37.738954 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:37 crc kubenswrapper[4766]: I1209 03:31:37.739511 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:37 crc kubenswrapper[4766]: E1209 03:31:37.739402 4766 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 03:31:37 crc kubenswrapper[4766]: E1209 
03:31:37.739609 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:41.739587883 +0000 UTC m=+1183.448893309 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "metrics-server-cert" not found Dec 09 03:31:37 crc kubenswrapper[4766]: E1209 03:31:37.739675 4766 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 03:31:37 crc kubenswrapper[4766]: E1209 03:31:37.739730 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:41.739714446 +0000 UTC m=+1183.449019862 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "webhook-server-cert" not found Dec 09 03:31:40 crc kubenswrapper[4766]: I1209 03:31:40.777468 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:40 crc kubenswrapper[4766]: E1209 03:31:40.777675 4766 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:40 crc kubenswrapper[4766]: E1209 03:31:40.777766 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert podName:ef10710a-81c7-46f4-8c5c-3aabdc02a833 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:48.777743888 +0000 UTC m=+1190.487049324 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert") pod "infra-operator-controller-manager-78d48bff9d-sn9z2" (UID: "ef10710a-81c7-46f4-8c5c-3aabdc02a833") : secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:41 crc kubenswrapper[4766]: I1209 03:31:41.086624 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:41 crc kubenswrapper[4766]: E1209 03:31:41.086765 4766 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:41 crc kubenswrapper[4766]: E1209 03:31:41.086851 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert podName:b2580526-5f4a-4340-9856-bc78a320e610 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:49.086829885 +0000 UTC m=+1190.796135311 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fbrcdj" (UID: "b2580526-5f4a-4340-9856-bc78a320e610") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:41 crc kubenswrapper[4766]: I1209 03:31:41.795861 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:41 crc kubenswrapper[4766]: E1209 03:31:41.796104 4766 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 03:31:41 crc kubenswrapper[4766]: I1209 03:31:41.796271 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:41 crc kubenswrapper[4766]: E1209 03:31:41.796326 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:49.796306284 +0000 UTC m=+1191.505611710 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "metrics-server-cert" not found Dec 09 03:31:41 crc kubenswrapper[4766]: E1209 03:31:41.796392 4766 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 03:31:41 crc kubenswrapper[4766]: E1209 03:31:41.796446 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:31:49.796432787 +0000 UTC m=+1191.505738213 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "webhook-server-cert" not found Dec 09 03:31:47 crc kubenswrapper[4766]: E1209 03:31:47.627056 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8" Dec 09 03:31:47 crc kubenswrapper[4766]: E1209 03:31:47.627945 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zh95n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-74tq4_openstack-operators(c3606954-b281-4fa3-b96a-a62ab4092a78): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:31:48 crc kubenswrapper[4766]: E1209 03:31:48.247453 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-86xrs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-mwbr2_openstack-operators(3467a8bf-3f6b-49ce-9e3a-5d834456bbaf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:48 crc kubenswrapper[4766]: E1209 03:31:48.252155 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" podUID="3467a8bf-3f6b-49ce-9e3a-5d834456bbaf" Dec 09 03:31:48 crc kubenswrapper[4766]: E1209 03:31:48.288876 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wx65k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-7cdxh_openstack-operators(50ba0df7-74df-4798-a0c0-39dda9c4e3ef): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 09 03:31:48 crc kubenswrapper[4766]: E1209 03:31:48.290043 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" podUID="50ba0df7-74df-4798-a0c0-39dda9c4e3ef" Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.802385 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:31:48 crc kubenswrapper[4766]: E1209 03:31:48.802577 4766 secret.go:188] Couldn't 
get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:48 crc kubenswrapper[4766]: E1209 03:31:48.802645 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert podName:ef10710a-81c7-46f4-8c5c-3aabdc02a833 nodeName:}" failed. No retries permitted until 2025-12-09 03:32:04.802626388 +0000 UTC m=+1206.511931814 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert") pod "infra-operator-controller-manager-78d48bff9d-sn9z2" (UID: "ef10710a-81c7-46f4-8c5c-3aabdc02a833") : secret "infra-operator-webhook-server-cert" not found Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.936096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" event={"ID":"b6f43661-b796-4f31-aaa9-482f85952578","Type":"ContainerStarted","Data":"6cf11e1d1203ed80012b7a734b96f8e6273bbac84d74a8bc49621f724e88143c"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.937375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" event={"ID":"735b0ee4-6bb2-41f4-b6e9-494e5d73b584","Type":"ContainerStarted","Data":"b6ef0e7c37be79d48e09a318d873617bba944db055ee4f05d41b2e33ffa72670"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.938919 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" event={"ID":"f911184d-8572-4bc3-abbd-b76245ce463c","Type":"ContainerStarted","Data":"362a10ad13f532ea5c8bd552e0fb5319bd8f090dead758a2ec68bca3fc74b114"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.940264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" event={"ID":"ac56e057-6439-40ab-bb04-ce228a828444","Type":"ContainerStarted","Data":"195d160979725bba9e3fdf6c559d4e65b50c72e4c11f8034c053a003576d1866"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.941384 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" event={"ID":"3467a8bf-3f6b-49ce-9e3a-5d834456bbaf","Type":"ContainerStarted","Data":"dd18057284f57ff15aa826716f62fde762b8cf762234d62d6d77c5b080aad31a"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.941509 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" Dec 09 03:31:48 crc kubenswrapper[4766]: E1209 03:31:48.942534 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" podUID="3467a8bf-3f6b-49ce-9e3a-5d834456bbaf" Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.942794 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" event={"ID":"ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0","Type":"ContainerStarted","Data":"2e63f3dd48920233c5f3a88363f6f78aa9700a25e6b705d05042c1e22cfa8c4c"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.943899 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" event={"ID":"50ba0df7-74df-4798-a0c0-39dda9c4e3ef","Type":"ContainerStarted","Data":"f2e89b1e19e9d8f5a19590e6bfe1717d95e3d9989bf78295eee359109553436e"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.944026 4766 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" Dec 09 03:31:48 crc kubenswrapper[4766]: E1209 03:31:48.944826 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" podUID="50ba0df7-74df-4798-a0c0-39dda9c4e3ef" Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.945164 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" event={"ID":"95b141d2-5fb4-46f6-b5af-92720c9be11c","Type":"ContainerStarted","Data":"2274581ba1e0094fb942d088d3208c36b5d4bc3c256d94203f7318341d415620"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.946348 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" event={"ID":"146239d5-9360-4fa2-8a76-01743455b5f1","Type":"ContainerStarted","Data":"2bd2044d594e5e6bd975e2b09f7ae3c807e404ffd0adae64b3af582097d9060c"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.947483 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" event={"ID":"e8d80517-4f1e-4123-b002-e7990cc9c945","Type":"ContainerStarted","Data":"2d867a8723e9a8f260c5b123625e8e4bb9396bf3d91593adb510ee66d65e00dd"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.948988 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" event={"ID":"3aea0f89-2c0b-4705-92cf-17a37169675e","Type":"ContainerStarted","Data":"6cddfa9e63bbea4ff688bb7a1b79d9b1627ae6364a6bb00bbba387c0cf59dfe4"} Dec 09 03:31:48 crc kubenswrapper[4766]: I1209 03:31:48.950345 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" event={"ID":"bb6ed96a-0045-42ea-a13e-dc4a82714b9d","Type":"ContainerStarted","Data":"7e89ccb40a99fa28e407a65513a164cca21ac962cb01b4552997b046cc450450"} Dec 09 03:31:49 crc kubenswrapper[4766]: I1209 03:31:49.106410 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:31:49 crc kubenswrapper[4766]: E1209 03:31:49.106684 4766 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:49 crc kubenswrapper[4766]: E1209 03:31:49.106780 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert podName:b2580526-5f4a-4340-9856-bc78a320e610 nodeName:}" failed. No retries permitted until 2025-12-09 03:32:05.106761051 +0000 UTC m=+1206.816066477 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fbrcdj" (UID: "b2580526-5f4a-4340-9856-bc78a320e610") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 09 03:31:49 crc kubenswrapper[4766]: I1209 03:31:49.815222 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:49 crc kubenswrapper[4766]: I1209 03:31:49.815297 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:31:49 crc kubenswrapper[4766]: E1209 03:31:49.815394 4766 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 09 03:31:49 crc kubenswrapper[4766]: E1209 03:31:49.815490 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:32:05.81547233 +0000 UTC m=+1207.524777756 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "metrics-server-cert" not found Dec 09 03:31:49 crc kubenswrapper[4766]: E1209 03:31:49.815504 4766 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 09 03:31:49 crc kubenswrapper[4766]: E1209 03:31:49.815579 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs podName:a315c06a-893b-4f1b-9b0e-0120afcfeb00 nodeName:}" failed. No retries permitted until 2025-12-09 03:32:05.815561652 +0000 UTC m=+1207.524867078 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs") pod "openstack-operator-controller-manager-646dd6f965-mkjpj" (UID: "a315c06a-893b-4f1b-9b0e-0120afcfeb00") : secret "webhook-server-cert" not found Dec 09 03:31:49 crc kubenswrapper[4766]: E1209 03:31:49.957597 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" podUID="50ba0df7-74df-4798-a0c0-39dda9c4e3ef" Dec 09 03:31:49 crc kubenswrapper[4766]: E1209 03:31:49.957661 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" podUID="3467a8bf-3f6b-49ce-9e3a-5d834456bbaf" Dec 09 03:31:53 
crc kubenswrapper[4766]: I1209 03:31:53.463120 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" Dec 09 03:31:53 crc kubenswrapper[4766]: E1209 03:31:53.465469 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" podUID="50ba0df7-74df-4798-a0c0-39dda9c4e3ef" Dec 09 03:31:53 crc kubenswrapper[4766]: I1209 03:31:53.538390 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" Dec 09 03:31:53 crc kubenswrapper[4766]: E1209 03:31:53.540315 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" podUID="3467a8bf-3f6b-49ce-9e3a-5d834456bbaf" Dec 09 03:32:01 crc kubenswrapper[4766]: E1209 03:32:01.885304 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" podUID="c3606954-b281-4fa3-b96a-a62ab4092a78" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.081156 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" event={"ID":"d1aa5579-7f42-4ee2-af62-3b2a7d391101","Type":"ContainerStarted","Data":"70dbde862103623409eda95d7f1e4acd8b0ad2e4ea86d4664567fc3bf7912503"} Dec 09 
03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.098867 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" event={"ID":"3aea0f89-2c0b-4705-92cf-17a37169675e","Type":"ContainerStarted","Data":"9c89f786a0b7dd20c843d8300b905b78ceee6c3e958395c870aa48c5ec49c8c5"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.102787 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.110903 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.118574 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" event={"ID":"4fa83023-eb1d-4ab9-b80a-b9bd04d342a3","Type":"ContainerStarted","Data":"2f8c44ce86d0756e784eb9378635202c0257b08a8879e8192bb2600319f0114e"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.119273 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.120475 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" event={"ID":"14e6b34f-8412-44ea-8342-0350ccf7f7c9","Type":"ContainerStarted","Data":"50f5cad680428f837fbbd6ad6dbfa1e74a202f8941ad4f3d7c955a487821ee60"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.121584 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" 
event={"ID":"636d7080-c310-4a2a-a07c-ef0aab6412ba","Type":"ContainerStarted","Data":"09edf6af140e3602aa396fdadddc5b2639adc7d630f89b62ee3d3c8d069d5ef5"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.122812 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" event={"ID":"ac56e057-6439-40ab-bb04-ce228a828444","Type":"ContainerStarted","Data":"eb254bc25e983555ce0bf96cb59fd85ef1bde731b668bdd25ef80fcfa54e394b"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.123355 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.133557 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.138650 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" event={"ID":"0d1a15a1-afd0-41a2-bf4d-21d98d4730b4","Type":"ContainerStarted","Data":"70cfcd30e5dbf0d434e4ce7ce9651f67e8eb88d67a54a5ddf52f863eb6cdeab8"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.161053 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" event={"ID":"e8d80517-4f1e-4123-b002-e7990cc9c945","Type":"ContainerStarted","Data":"8c24310ff33e76f89f047ba14546cc87f2d1036d4493201e409f50ae259c5c48"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.162306 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.162442 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-whnqg" podStartSLOduration=2.670511372 podStartE2EDuration="29.16242214s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.861463853 +0000 UTC m=+1176.570769279" lastFinishedPulling="2025-12-09 03:32:01.353374621 +0000 UTC m=+1203.062680047" observedRunningTime="2025-12-09 03:32:02.129656618 +0000 UTC m=+1203.838962044" watchObservedRunningTime="2025-12-09 03:32:02.16242214 +0000 UTC m=+1203.871727566" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.184291 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" event={"ID":"bb6ed96a-0045-42ea-a13e-dc4a82714b9d","Type":"ContainerStarted","Data":"804edeb97e5dbc035e3a42135f63ccca19d4072e93a3c5153bb221a286a7fa44"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.188815 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.195800 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.196614 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" podStartSLOduration=2.941389251 podStartE2EDuration="29.19660523s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.978990085 +0000 UTC m=+1176.688295511" lastFinishedPulling="2025-12-09 03:32:01.234206034 +0000 UTC m=+1202.943511490" observedRunningTime="2025-12-09 03:32:02.193946098 +0000 UTC m=+1203.903251524" watchObservedRunningTime="2025-12-09 03:32:02.19660523 +0000 UTC m=+1203.905910656" Dec 09 03:32:02 crc kubenswrapper[4766]: 
I1209 03:32:02.218745 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" event={"ID":"ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0","Type":"ContainerStarted","Data":"c9287ad89c8f9771ab87dc522d63e2a3cbf6071318d04a732b627204621b1701"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.219629 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.225470 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.229719 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.249843 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" event={"ID":"735b0ee4-6bb2-41f4-b6e9-494e5d73b584","Type":"ContainerStarted","Data":"c70e004b576f876cb2e2ee84fb673cf9ba6649f1a3d9c29c05afc50a24f1f37b"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.250805 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.256737 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.282404 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-vvdmw" podStartSLOduration=3.742719095 
podStartE2EDuration="30.282390408s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.813595455 +0000 UTC m=+1176.522900881" lastFinishedPulling="2025-12-09 03:32:01.353266768 +0000 UTC m=+1203.062572194" observedRunningTime="2025-12-09 03:32:02.240525872 +0000 UTC m=+1203.949831288" watchObservedRunningTime="2025-12-09 03:32:02.282390408 +0000 UTC m=+1203.991695834" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.288262 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" event={"ID":"b6f43661-b796-4f31-aaa9-482f85952578","Type":"ContainerStarted","Data":"0834e97fbc755023498631019c73557551c18063f1ec081b4f663cd41cfe06f6"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.289060 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.293304 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.309822 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-ng459" podStartSLOduration=3.719689754 podStartE2EDuration="30.309808315s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.819416621 +0000 UTC m=+1176.528722047" lastFinishedPulling="2025-12-09 03:32:01.409535182 +0000 UTC m=+1203.118840608" observedRunningTime="2025-12-09 03:32:02.305587092 +0000 UTC m=+1204.014892518" watchObservedRunningTime="2025-12-09 03:32:02.309808315 +0000 UTC m=+1204.019113741" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.316507 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" event={"ID":"f911184d-8572-4bc3-abbd-b76245ce463c","Type":"ContainerStarted","Data":"912da972282574e6ef9541e4b37b53607362f17ac2966fefd89f45c39bbdacee"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.317321 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.327487 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.341406 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" event={"ID":"c3606954-b281-4fa3-b96a-a62ab4092a78","Type":"ContainerStarted","Data":"0b877658b7e5cc553a5bff321403b9b5a47c15f6ba11329412101600ac0aa5c0"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.386239 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" event={"ID":"146239d5-9360-4fa2-8a76-01743455b5f1","Type":"ContainerStarted","Data":"461bc584ed120e8b7fc582bc15a79f48c7395510e5d88804c2de34fab7bceb5c"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.389337 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.391150 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" event={"ID":"3b4cd02e-a82d-48d9-8078-9cdf3a65767c","Type":"ContainerStarted","Data":"b04dd2eb78ff1610cf7b98a88cd45d40caba0db6ea92c454e8a0341503bc94ba"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.391956 4766 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.394809 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-p2sks" podStartSLOduration=3.05426476 podStartE2EDuration="30.394787772s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.011893474 +0000 UTC m=+1175.721198900" lastFinishedPulling="2025-12-09 03:32:01.352416486 +0000 UTC m=+1203.061721912" observedRunningTime="2025-12-09 03:32:02.369867922 +0000 UTC m=+1204.079173378" watchObservedRunningTime="2025-12-09 03:32:02.394787772 +0000 UTC m=+1204.104093208" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.405462 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.410970 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" event={"ID":"95b141d2-5fb4-46f6-b5af-92720c9be11c","Type":"ContainerStarted","Data":"00c9714ed44108dd55ae299f0dd2c3a6f6e76fd9cc1a5542995f91e6b625aa33"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.411604 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.421389 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-gc98f" podStartSLOduration=2.852164293 podStartE2EDuration="30.421366387s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:33.784131876 +0000 UTC m=+1175.493437302" 
lastFinishedPulling="2025-12-09 03:32:01.35333398 +0000 UTC m=+1203.062639396" observedRunningTime="2025-12-09 03:32:02.407587176 +0000 UTC m=+1204.116892602" watchObservedRunningTime="2025-12-09 03:32:02.421366387 +0000 UTC m=+1204.130671813" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.429489 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.447505 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6rt97" podStartSLOduration=3.931109474 podStartE2EDuration="30.44748454s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.828672951 +0000 UTC m=+1176.537978377" lastFinishedPulling="2025-12-09 03:32:01.345048017 +0000 UTC m=+1203.054353443" observedRunningTime="2025-12-09 03:32:02.438125648 +0000 UTC m=+1204.147431084" watchObservedRunningTime="2025-12-09 03:32:02.44748454 +0000 UTC m=+1204.156789966" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.453391 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" event={"ID":"581540bb-4d83-41b5-9384-e91238383025","Type":"ContainerStarted","Data":"454f517b29b5f856ff10bf922a371968126b58f9a05a4073a7d9d7f60e70bb37"} Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.477030 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5z662" podStartSLOduration=3.932501371 podStartE2EDuration="30.477016284s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.806560526 +0000 UTC m=+1176.515865952" lastFinishedPulling="2025-12-09 03:32:01.351075439 +0000 UTC m=+1203.060380865" observedRunningTime="2025-12-09 
03:32:02.471251049 +0000 UTC m=+1204.180556475" watchObservedRunningTime="2025-12-09 03:32:02.477016284 +0000 UTC m=+1204.186321710" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.515194 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" podStartSLOduration=3.141498765 podStartE2EDuration="29.515180411s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.871593256 +0000 UTC m=+1176.580898682" lastFinishedPulling="2025-12-09 03:32:01.245274902 +0000 UTC m=+1202.954580328" observedRunningTime="2025-12-09 03:32:02.512795967 +0000 UTC m=+1204.222101393" watchObservedRunningTime="2025-12-09 03:32:02.515180411 +0000 UTC m=+1204.224485837" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.543387 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-shhtg" podStartSLOduration=4.017501758 podStartE2EDuration="30.54337011s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.813840721 +0000 UTC m=+1176.523146147" lastFinishedPulling="2025-12-09 03:32:01.339709073 +0000 UTC m=+1203.049014499" observedRunningTime="2025-12-09 03:32:02.542130166 +0000 UTC m=+1204.251435612" watchObservedRunningTime="2025-12-09 03:32:02.54337011 +0000 UTC m=+1204.252675536" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.573009 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-fk6b4" podStartSLOduration=3.174942113 podStartE2EDuration="29.572989136s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:35.010832612 +0000 UTC m=+1176.720138038" lastFinishedPulling="2025-12-09 03:32:01.408879635 +0000 UTC m=+1203.118185061" observedRunningTime="2025-12-09 03:32:02.564535989 
+0000 UTC m=+1204.273841405" watchObservedRunningTime="2025-12-09 03:32:02.572989136 +0000 UTC m=+1204.282294562" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.643812 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-89h67" podStartSLOduration=4.123779037 podStartE2EDuration="30.643771031s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.819062112 +0000 UTC m=+1176.528367548" lastFinishedPulling="2025-12-09 03:32:01.339054116 +0000 UTC m=+1203.048359542" observedRunningTime="2025-12-09 03:32:02.635823587 +0000 UTC m=+1204.345129013" watchObservedRunningTime="2025-12-09 03:32:02.643771031 +0000 UTC m=+1204.353076457" Dec 09 03:32:02 crc kubenswrapper[4766]: I1209 03:32:02.686850 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dxctt" podStartSLOduration=3.215014952 podStartE2EDuration="29.68683277s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.891974794 +0000 UTC m=+1176.601280220" lastFinishedPulling="2025-12-09 03:32:01.363792612 +0000 UTC m=+1203.073098038" observedRunningTime="2025-12-09 03:32:02.684434585 +0000 UTC m=+1204.393740011" watchObservedRunningTime="2025-12-09 03:32:02.68683277 +0000 UTC m=+1204.396138196" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.461769 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" event={"ID":"0d1a15a1-afd0-41a2-bf4d-21d98d4730b4","Type":"ContainerStarted","Data":"3cad504c1040da7b9d539566717ed5f1e8559d5df8d48fd7d9ae1862f1af91ba"} Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.462127 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" Dec 09 03:32:03 
crc kubenswrapper[4766]: I1209 03:32:03.463768 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" event={"ID":"4fa83023-eb1d-4ab9-b80a-b9bd04d342a3","Type":"ContainerStarted","Data":"9d63b680cda8ddf68b76d142b569215776e9b832c717bb41eb13df7a3b0f18ff"} Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.466046 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" event={"ID":"d1aa5579-7f42-4ee2-af62-3b2a7d391101","Type":"ContainerStarted","Data":"d57e835a53e9c1d25c8d0cbac1d47ed84c45077b5fbea0a4770e8f7e01cc7ffc"} Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.466107 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.468464 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" event={"ID":"3b4cd02e-a82d-48d9-8078-9cdf3a65767c","Type":"ContainerStarted","Data":"6d25dbeab0546a585e6c997d757f06fe8cde5a093d5ca1d568605f4e9a9aeb03"} Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.472300 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" event={"ID":"14e6b34f-8412-44ea-8342-0350ccf7f7c9","Type":"ContainerStarted","Data":"f46ab9e81b3825c7f7f4412387653d34361133dd7bee360375fe8bf2e7fb0b3c"} Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.472437 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.474430 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" 
event={"ID":"c3606954-b281-4fa3-b96a-a62ab4092a78","Type":"ContainerStarted","Data":"8bde737dc16f00c79ec1d24e26502c4cf78bf6ef7deb1c93e30d95d8b108e7bf"} Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.474612 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.476293 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" event={"ID":"636d7080-c310-4a2a-a07c-ef0aab6412ba","Type":"ContainerStarted","Data":"be5c178ecf9c88a379d7d7930e57e3988648a1d9b5490bb768d2e6c7f49bb871"} Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.489836 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" podStartSLOduration=4.232935261 podStartE2EDuration="30.489821246s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.978647576 +0000 UTC m=+1176.687953002" lastFinishedPulling="2025-12-09 03:32:01.235533541 +0000 UTC m=+1202.944838987" observedRunningTime="2025-12-09 03:32:03.482633002 +0000 UTC m=+1205.191938448" watchObservedRunningTime="2025-12-09 03:32:03.489821246 +0000 UTC m=+1205.199126672" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.529994 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" podStartSLOduration=4.273597444 podStartE2EDuration="30.529977136s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.978733868 +0000 UTC m=+1176.688039294" lastFinishedPulling="2025-12-09 03:32:01.23511356 +0000 UTC m=+1202.944418986" observedRunningTime="2025-12-09 03:32:03.510370408 +0000 UTC m=+1205.219675854" watchObservedRunningTime="2025-12-09 03:32:03.529977136 
+0000 UTC m=+1205.239282562" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.530454 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" podStartSLOduration=4.164196381 podStartE2EDuration="30.530450998s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.86802999 +0000 UTC m=+1176.577335416" lastFinishedPulling="2025-12-09 03:32:01.234284607 +0000 UTC m=+1202.943590033" observedRunningTime="2025-12-09 03:32:03.528185797 +0000 UTC m=+1205.237491223" watchObservedRunningTime="2025-12-09 03:32:03.530450998 +0000 UTC m=+1205.239756424" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.547263 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" podStartSLOduration=4.181706852 podStartE2EDuration="30.547206849s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.879825347 +0000 UTC m=+1176.589130773" lastFinishedPulling="2025-12-09 03:32:01.245325344 +0000 UTC m=+1202.954630770" observedRunningTime="2025-12-09 03:32:03.541115095 +0000 UTC m=+1205.250420521" watchObservedRunningTime="2025-12-09 03:32:03.547206849 +0000 UTC m=+1205.256512275" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.558178 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" podStartSLOduration=2.524180325 podStartE2EDuration="30.558160904s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.848959137 +0000 UTC m=+1176.558264563" lastFinishedPulling="2025-12-09 03:32:02.882939706 +0000 UTC m=+1204.592245142" observedRunningTime="2025-12-09 03:32:03.556743666 +0000 UTC m=+1205.266049102" watchObservedRunningTime="2025-12-09 03:32:03.558160904 +0000 UTC 
m=+1205.267466330" Dec 09 03:32:03 crc kubenswrapper[4766]: I1209 03:32:03.856043 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" Dec 09 03:32:04 crc kubenswrapper[4766]: I1209 03:32:04.854467 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:32:04 crc kubenswrapper[4766]: I1209 03:32:04.869298 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef10710a-81c7-46f4-8c5c-3aabdc02a833-cert\") pod \"infra-operator-controller-manager-78d48bff9d-sn9z2\" (UID: \"ef10710a-81c7-46f4-8c5c-3aabdc02a833\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.135735 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9497p" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.144870 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.158168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.162927 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2580526-5f4a-4340-9856-bc78a320e610-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fbrcdj\" (UID: \"b2580526-5f4a-4340-9856-bc78a320e610\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.242071 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7plqd" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.250203 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.621960 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2"] Dec 09 03:32:05 crc kubenswrapper[4766]: W1209 03:32:05.624625 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef10710a_81c7_46f4_8c5c_3aabdc02a833.slice/crio-3f0e4b223219d3b3f1baaef2f24d00870787403b3d9dbf337e59437a80585c24 WatchSource:0}: Error finding container 3f0e4b223219d3b3f1baaef2f24d00870787403b3d9dbf337e59437a80585c24: Status 404 returned error can't find the container with id 3f0e4b223219d3b3f1baaef2f24d00870787403b3d9dbf337e59437a80585c24 Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.696434 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj"] Dec 09 03:32:05 crc kubenswrapper[4766]: W1209 03:32:05.698042 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2580526_5f4a_4340_9856_bc78a320e610.slice/crio-042d842bf63912e92ed3ed2de2e16f005fdd5749ff12fbc1dde9f17d1faa9801 WatchSource:0}: Error finding container 042d842bf63912e92ed3ed2de2e16f005fdd5749ff12fbc1dde9f17d1faa9801: Status 404 returned error can't find the container with id 042d842bf63912e92ed3ed2de2e16f005fdd5749ff12fbc1dde9f17d1faa9801 Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.868827 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " 
pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.868930 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.876328 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-metrics-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:32:05 crc kubenswrapper[4766]: I1209 03:32:05.877195 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a315c06a-893b-4f1b-9b0e-0120afcfeb00-webhook-certs\") pod \"openstack-operator-controller-manager-646dd6f965-mkjpj\" (UID: \"a315c06a-893b-4f1b-9b0e-0120afcfeb00\") " pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:32:06 crc kubenswrapper[4766]: I1209 03:32:06.000303 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5l4lg" Dec 09 03:32:06 crc kubenswrapper[4766]: I1209 03:32:06.008463 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:32:06 crc kubenswrapper[4766]: I1209 03:32:06.446460 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj"] Dec 09 03:32:06 crc kubenswrapper[4766]: W1209 03:32:06.452486 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda315c06a_893b_4f1b_9b0e_0120afcfeb00.slice/crio-506450b5aa9ef3c884732cf599824efe19a534bde6e5c595f6a0ef0b77a1b709 WatchSource:0}: Error finding container 506450b5aa9ef3c884732cf599824efe19a534bde6e5c595f6a0ef0b77a1b709: Status 404 returned error can't find the container with id 506450b5aa9ef3c884732cf599824efe19a534bde6e5c595f6a0ef0b77a1b709 Dec 09 03:32:06 crc kubenswrapper[4766]: I1209 03:32:06.501374 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" event={"ID":"a315c06a-893b-4f1b-9b0e-0120afcfeb00","Type":"ContainerStarted","Data":"506450b5aa9ef3c884732cf599824efe19a534bde6e5c595f6a0ef0b77a1b709"} Dec 09 03:32:06 crc kubenswrapper[4766]: I1209 03:32:06.502385 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" event={"ID":"ef10710a-81c7-46f4-8c5c-3aabdc02a833","Type":"ContainerStarted","Data":"3f0e4b223219d3b3f1baaef2f24d00870787403b3d9dbf337e59437a80585c24"} Dec 09 03:32:06 crc kubenswrapper[4766]: I1209 03:32:06.503436 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" event={"ID":"b2580526-5f4a-4340-9856-bc78a320e610","Type":"ContainerStarted","Data":"042d842bf63912e92ed3ed2de2e16f005fdd5749ff12fbc1dde9f17d1faa9801"} Dec 09 03:32:07 crc kubenswrapper[4766]: I1209 03:32:07.511352 4766 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" event={"ID":"50ba0df7-74df-4798-a0c0-39dda9c4e3ef","Type":"ContainerStarted","Data":"d69ea8df5bc09007c32409dfd48eadbb4e0994e3c06293e2e201892357bacd67"} Dec 09 03:32:07 crc kubenswrapper[4766]: I1209 03:32:07.513988 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" event={"ID":"a315c06a-893b-4f1b-9b0e-0120afcfeb00","Type":"ContainerStarted","Data":"d13fe55237e1fc586eaba9cbff58fc95dca0dab3f494c7530a5e490404ce6877"} Dec 09 03:32:07 crc kubenswrapper[4766]: I1209 03:32:07.514287 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:32:07 crc kubenswrapper[4766]: I1209 03:32:07.533561 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-7cdxh" podStartSLOduration=22.639443504 podStartE2EDuration="35.533524166s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.843548821 +0000 UTC m=+1176.552854247" lastFinishedPulling="2025-12-09 03:31:47.737629473 +0000 UTC m=+1189.446934909" observedRunningTime="2025-12-09 03:32:07.526355954 +0000 UTC m=+1209.235661380" watchObservedRunningTime="2025-12-09 03:32:07.533524166 +0000 UTC m=+1209.242829592" Dec 09 03:32:07 crc kubenswrapper[4766]: I1209 03:32:07.596235 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" podStartSLOduration=34.596202693 podStartE2EDuration="34.596202693s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:32:07.591808964 +0000 UTC 
m=+1209.301114400" watchObservedRunningTime="2025-12-09 03:32:07.596202693 +0000 UTC m=+1209.305508119" Dec 09 03:32:08 crc kubenswrapper[4766]: I1209 03:32:08.525160 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" event={"ID":"ef10710a-81c7-46f4-8c5c-3aabdc02a833","Type":"ContainerStarted","Data":"7c7a05f800011295432b7e561069556c1a74d97d29ded8b152bd60a06f3b9c91"} Dec 09 03:32:08 crc kubenswrapper[4766]: I1209 03:32:08.528718 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" event={"ID":"3467a8bf-3f6b-49ce-9e3a-5d834456bbaf","Type":"ContainerStarted","Data":"6353c9f4c434fb9410dc695bd49fe96beeafaacbb83a69e623bf49acac2b633b"} Dec 09 03:32:08 crc kubenswrapper[4766]: I1209 03:32:08.550077 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-mwbr2" podStartSLOduration=23.636284135 podStartE2EDuration="36.550056537s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:31:34.819174885 +0000 UTC m=+1176.528480311" lastFinishedPulling="2025-12-09 03:31:47.732947277 +0000 UTC m=+1189.442252713" observedRunningTime="2025-12-09 03:32:08.544369764 +0000 UTC m=+1210.253675210" watchObservedRunningTime="2025-12-09 03:32:08.550056537 +0000 UTC m=+1210.259361983" Dec 09 03:32:09 crc kubenswrapper[4766]: I1209 03:32:09.542986 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" event={"ID":"b2580526-5f4a-4340-9856-bc78a320e610","Type":"ContainerStarted","Data":"f80a0543152967b825fa68249953f7e89de5418a14870fe1b12ff61583b2db53"} Dec 09 03:32:09 crc kubenswrapper[4766]: I1209 03:32:09.543064 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" event={"ID":"b2580526-5f4a-4340-9856-bc78a320e610","Type":"ContainerStarted","Data":"fce61afe26a83c4363f61da3c1a24296704e014f276be03fb0f646e278b1fa0a"} Dec 09 03:32:09 crc kubenswrapper[4766]: I1209 03:32:09.545447 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" event={"ID":"ef10710a-81c7-46f4-8c5c-3aabdc02a833","Type":"ContainerStarted","Data":"7616cac6d61ef863aa499d69539df9d6ab81bf2dfda9bcd8b129466302382e82"} Dec 09 03:32:09 crc kubenswrapper[4766]: I1209 03:32:09.545944 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:32:09 crc kubenswrapper[4766]: I1209 03:32:09.584451 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" podStartSLOduration=33.976453518 podStartE2EDuration="36.584418308s" podCreationTimestamp="2025-12-09 03:31:33 +0000 UTC" firstStartedPulling="2025-12-09 03:32:05.700124657 +0000 UTC m=+1207.409430083" lastFinishedPulling="2025-12-09 03:32:08.308089437 +0000 UTC m=+1210.017394873" observedRunningTime="2025-12-09 03:32:09.57260577 +0000 UTC m=+1211.281911206" watchObservedRunningTime="2025-12-09 03:32:09.584418308 +0000 UTC m=+1211.293723764" Dec 09 03:32:09 crc kubenswrapper[4766]: I1209 03:32:09.596179 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" podStartSLOduration=34.918436435 podStartE2EDuration="37.596160293s" podCreationTimestamp="2025-12-09 03:31:32 +0000 UTC" firstStartedPulling="2025-12-09 03:32:05.626402002 +0000 UTC m=+1207.335707448" lastFinishedPulling="2025-12-09 03:32:08.30412588 +0000 UTC m=+1210.013431306" observedRunningTime="2025-12-09 
03:32:09.59117174 +0000 UTC m=+1211.300477166" watchObservedRunningTime="2025-12-09 03:32:09.596160293 +0000 UTC m=+1211.305465719" Dec 09 03:32:10 crc kubenswrapper[4766]: I1209 03:32:10.556527 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:32:13 crc kubenswrapper[4766]: I1209 03:32:13.583257 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-qfdx4" Dec 09 03:32:13 crc kubenswrapper[4766]: I1209 03:32:13.620843 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-rb9kw" Dec 09 03:32:13 crc kubenswrapper[4766]: I1209 03:32:13.785155 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-ln78g" Dec 09 03:32:13 crc kubenswrapper[4766]: I1209 03:32:13.854396 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-dgds5" Dec 09 03:32:13 crc kubenswrapper[4766]: I1209 03:32:13.857578 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rllsr" Dec 09 03:32:14 crc kubenswrapper[4766]: I1209 03:32:14.056270 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-62wlz" Dec 09 03:32:14 crc kubenswrapper[4766]: I1209 03:32:14.121391 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-74tq4" Dec 09 03:32:15 crc kubenswrapper[4766]: I1209 03:32:15.151520 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-sn9z2" Dec 09 03:32:15 crc kubenswrapper[4766]: I1209 03:32:15.255671 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fbrcdj" Dec 09 03:32:16 crc kubenswrapper[4766]: I1209 03:32:16.014867 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-646dd6f965-mkjpj" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.081092 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cpkpv"] Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.082751 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.088700 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jtjxm" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.088951 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.089248 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.090592 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.098088 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cpkpv"] Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.140560 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kr5sq"] Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.142766 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.147035 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.148712 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kr5sq"] Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.193968 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqc96\" (UniqueName: \"kubernetes.io/projected/079f2043-c863-48cb-b833-cee2091fcdc7-kube-api-access-bqc96\") pod \"dnsmasq-dns-675f4bcbfc-cpkpv\" (UID: \"079f2043-c863-48cb-b833-cee2091fcdc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.194050 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079f2043-c863-48cb-b833-cee2091fcdc7-config\") pod \"dnsmasq-dns-675f4bcbfc-cpkpv\" (UID: \"079f2043-c863-48cb-b833-cee2091fcdc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.295254 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqc96\" (UniqueName: \"kubernetes.io/projected/079f2043-c863-48cb-b833-cee2091fcdc7-kube-api-access-bqc96\") pod \"dnsmasq-dns-675f4bcbfc-cpkpv\" (UID: \"079f2043-c863-48cb-b833-cee2091fcdc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.295302 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-config\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 
09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.295357 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.295380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079f2043-c863-48cb-b833-cee2091fcdc7-config\") pod \"dnsmasq-dns-675f4bcbfc-cpkpv\" (UID: \"079f2043-c863-48cb-b833-cee2091fcdc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.295402 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/231db44a-48fc-4dd4-95b3-76cea8a3ef13-kube-api-access-b5wv5\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.296272 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079f2043-c863-48cb-b833-cee2091fcdc7-config\") pod \"dnsmasq-dns-675f4bcbfc-cpkpv\" (UID: \"079f2043-c863-48cb-b833-cee2091fcdc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.322262 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqc96\" (UniqueName: \"kubernetes.io/projected/079f2043-c863-48cb-b833-cee2091fcdc7-kube-api-access-bqc96\") pod \"dnsmasq-dns-675f4bcbfc-cpkpv\" (UID: \"079f2043-c863-48cb-b833-cee2091fcdc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 
03:32:33.396565 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-config\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.396652 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.396695 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/231db44a-48fc-4dd4-95b3-76cea8a3ef13-kube-api-access-b5wv5\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.397935 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-config\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.398070 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.398285 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.415333 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/231db44a-48fc-4dd4-95b3-76cea8a3ef13-kube-api-access-b5wv5\") pod \"dnsmasq-dns-78dd6ddcc-kr5sq\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.465448 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.839243 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cpkpv"] Dec 09 03:32:33 crc kubenswrapper[4766]: W1209 03:32:33.849089 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod079f2043_c863_48cb_b833_cee2091fcdc7.slice/crio-4c8704b17962fa34aa18ded663a43aafe5ca6fe34083143c44fb17daff9ab6a8 WatchSource:0}: Error finding container 4c8704b17962fa34aa18ded663a43aafe5ca6fe34083143c44fb17daff9ab6a8: Status 404 returned error can't find the container with id 4c8704b17962fa34aa18ded663a43aafe5ca6fe34083143c44fb17daff9ab6a8 Dec 09 03:32:33 crc kubenswrapper[4766]: I1209 03:32:33.946866 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kr5sq"] Dec 09 03:32:33 crc kubenswrapper[4766]: W1209 03:32:33.950810 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod231db44a_48fc_4dd4_95b3_76cea8a3ef13.slice/crio-da1f22d8f30655273e53c21bf9efafba63ae4f4d85077676969218cd534221b8 WatchSource:0}: Error finding container da1f22d8f30655273e53c21bf9efafba63ae4f4d85077676969218cd534221b8: Status 404 returned error can't find the container with id 
da1f22d8f30655273e53c21bf9efafba63ae4f4d85077676969218cd534221b8 Dec 09 03:32:34 crc kubenswrapper[4766]: I1209 03:32:34.747670 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" event={"ID":"231db44a-48fc-4dd4-95b3-76cea8a3ef13","Type":"ContainerStarted","Data":"da1f22d8f30655273e53c21bf9efafba63ae4f4d85077676969218cd534221b8"} Dec 09 03:32:34 crc kubenswrapper[4766]: I1209 03:32:34.749124 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" event={"ID":"079f2043-c863-48cb-b833-cee2091fcdc7","Type":"ContainerStarted","Data":"4c8704b17962fa34aa18ded663a43aafe5ca6fe34083143c44fb17daff9ab6a8"} Dec 09 03:32:35 crc kubenswrapper[4766]: I1209 03:32:35.761530 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cpkpv"] Dec 09 03:32:35 crc kubenswrapper[4766]: I1209 03:32:35.788746 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-k8khs"] Dec 09 03:32:35 crc kubenswrapper[4766]: I1209 03:32:35.789844 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:35 crc kubenswrapper[4766]: I1209 03:32:35.803714 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-k8khs"] Dec 09 03:32:35 crc kubenswrapper[4766]: I1209 03:32:35.935418 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:35 crc kubenswrapper[4766]: I1209 03:32:35.935483 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcj6\" (UniqueName: \"kubernetes.io/projected/cb455f12-00ad-41c6-8035-4384770f42ed-kube-api-access-bxcj6\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:35 crc kubenswrapper[4766]: I1209 03:32:35.935540 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-config\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.024860 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kr5sq"] Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.037178 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-config\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:36 crc 
kubenswrapper[4766]: I1209 03:32:36.037331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.037367 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcj6\" (UniqueName: \"kubernetes.io/projected/cb455f12-00ad-41c6-8035-4384770f42ed-kube-api-access-bxcj6\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.038675 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-config\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.038965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.049708 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-88pjz"] Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.050821 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.068629 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-88pjz"] Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.073991 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcj6\" (UniqueName: \"kubernetes.io/projected/cb455f12-00ad-41c6-8035-4384770f42ed-kube-api-access-bxcj6\") pod \"dnsmasq-dns-5ccc8479f9-k8khs\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.109162 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.142497 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc754\" (UniqueName: \"kubernetes.io/projected/44ec5489-8c07-4223-875f-d0e43796ff74-kube-api-access-tc754\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.142566 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-config\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.142618 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.259190 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc754\" (UniqueName: \"kubernetes.io/projected/44ec5489-8c07-4223-875f-d0e43796ff74-kube-api-access-tc754\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.259498 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-config\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.259539 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.260360 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-config\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.260424 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.278524 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc754\" (UniqueName: \"kubernetes.io/projected/44ec5489-8c07-4223-875f-d0e43796ff74-kube-api-access-tc754\") pod \"dnsmasq-dns-57d769cc4f-88pjz\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.364546 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.421432 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-k8khs"] Dec 09 03:32:36 crc kubenswrapper[4766]: W1209 03:32:36.438781 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb455f12_00ad_41c6_8035_4384770f42ed.slice/crio-00d8a15fd3e007d2f91335fbc724959eac41e69d83c1c39a425a57da8ab410a1 WatchSource:0}: Error finding container 00d8a15fd3e007d2f91335fbc724959eac41e69d83c1c39a425a57da8ab410a1: Status 404 returned error can't find the container with id 00d8a15fd3e007d2f91335fbc724959eac41e69d83c1c39a425a57da8ab410a1 Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.764879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" event={"ID":"cb455f12-00ad-41c6-8035-4384770f42ed","Type":"ContainerStarted","Data":"00d8a15fd3e007d2f91335fbc724959eac41e69d83c1c39a425a57da8ab410a1"} Dec 09 03:32:36 crc kubenswrapper[4766]: I1209 03:32:36.787382 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-88pjz"] Dec 09 03:32:36 crc kubenswrapper[4766]: W1209 03:32:36.794358 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44ec5489_8c07_4223_875f_d0e43796ff74.slice/crio-39f124cdc3fd29a941802cdd5439d8093a5290336cf9c822ea1cf1dd3883219d WatchSource:0}: Error finding container 39f124cdc3fd29a941802cdd5439d8093a5290336cf9c822ea1cf1dd3883219d: Status 404 returned error can't find the container with id 39f124cdc3fd29a941802cdd5439d8093a5290336cf9c822ea1cf1dd3883219d Dec 09 03:32:37 crc kubenswrapper[4766]: I1209 03:32:37.773260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" event={"ID":"44ec5489-8c07-4223-875f-d0e43796ff74","Type":"ContainerStarted","Data":"39f124cdc3fd29a941802cdd5439d8093a5290336cf9c822ea1cf1dd3883219d"} Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.597349 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.598778 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.601516 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.601751 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.601995 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.602140 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.602363 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.602821 4766 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.603047 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tctqt" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.610867 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.612475 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.615418 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.615714 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x2tp6" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.615833 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.615973 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.617289 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.617478 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.617639 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.621757 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 
03:32:38.626082 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.675808 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.677954 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.689373 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.689918 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.691591 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-g8hnh" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.691721 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.696569 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.699782 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722361 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722402 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722423 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gn76\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-kube-api-access-9gn76\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722468 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722484 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722508 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722542 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qmk\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-kube-api-access-z9qmk\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722563 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722582 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722598 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722629 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/48862672-08e2-4ac6-86a3-57d84bbc868d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722648 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48862672-08e2-4ac6-86a3-57d84bbc868d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722681 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722697 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722711 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722727 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722747 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722767 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722789 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722807 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.722825 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.823986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824019 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824041 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gn76\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-kube-api-access-9gn76\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824062 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824090 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824108 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824130 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824154 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktq8\" (UniqueName: \"kubernetes.io/projected/26d1d344-fbf5-415d-952e-9ee50493a134-kube-api-access-bktq8\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824184 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qmk\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-kube-api-access-z9qmk\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824206 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " 
pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824243 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824260 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-default\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824278 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824291 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48862672-08e2-4ac6-86a3-57d84bbc868d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824309 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48862672-08e2-4ac6-86a3-57d84bbc868d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824334 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824349 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-operator-scripts\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824364 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824388 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824405 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824427 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824443 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824463 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-kolla-config\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824480 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824499 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-generated\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824516 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824530 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824545 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824564 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824579 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.824865 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: 
I1209 03:32:38.826570 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.826846 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.827034 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.827087 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.827864 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.828151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.828385 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.828685 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.828783 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.829638 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.832386 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48862672-08e2-4ac6-86a3-57d84bbc868d-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.833490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.842640 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.844083 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48862672-08e2-4ac6-86a3-57d84bbc868d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.844505 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.850064 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.857107 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.859931 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qmk\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-kube-api-access-z9qmk\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.860778 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gn76\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-kube-api-access-9gn76\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.862615 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.866255 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.872872 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.931069 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.934164 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktq8\" (UniqueName: \"kubernetes.io/projected/26d1d344-fbf5-415d-952e-9ee50493a134-kube-api-access-bktq8\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.934313 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-default\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.934415 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-operator-scripts\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.934508 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.934574 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.934647 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-generated\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.941794 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.942150 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.942491 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.938790 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-default\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.944483 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " pod="openstack/rabbitmq-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.956783 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-kolla-config\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.961826 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-operator-scripts\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.962081 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-generated\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.988289 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.990124 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:38 crc kubenswrapper[4766]: I1209 03:32:38.995908 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktq8\" (UniqueName: \"kubernetes.io/projected/26d1d344-fbf5-415d-952e-9ee50493a134-kube-api-access-bktq8\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:39 crc kubenswrapper[4766]: I1209 03:32:39.011707 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 03:32:39 crc kubenswrapper[4766]: I1209 03:32:39.029366 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " pod="openstack/openstack-galera-0" Dec 09 03:32:39 crc kubenswrapper[4766]: I1209 03:32:39.314137 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 03:32:39 crc kubenswrapper[4766]: I1209 03:32:39.523047 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 03:32:39 crc kubenswrapper[4766]: W1209 03:32:39.530437 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af438c1_d0b9_4ecb_bb88_a0efd14736a4.slice/crio-8b97799afc465daad45141bb9b70082bb279b1eac37bfa687c4d9ed8695a8f71 WatchSource:0}: Error finding container 8b97799afc465daad45141bb9b70082bb279b1eac37bfa687c4d9ed8695a8f71: Status 404 returned error can't find the container with id 8b97799afc465daad45141bb9b70082bb279b1eac37bfa687c4d9ed8695a8f71 Dec 09 03:32:39 crc kubenswrapper[4766]: I1209 03:32:39.579939 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 03:32:39 crc kubenswrapper[4766]: I1209 03:32:39.808175 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48862672-08e2-4ac6-86a3-57d84bbc868d","Type":"ContainerStarted","Data":"599f1cb825e907d0488c9144a6849ea3cfe74d4ef43106b360d7bce05eaa8d65"} Dec 09 03:32:39 crc kubenswrapper[4766]: I1209 03:32:39.816298 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3af438c1-d0b9-4ecb-bb88-a0efd14736a4","Type":"ContainerStarted","Data":"8b97799afc465daad45141bb9b70082bb279b1eac37bfa687c4d9ed8695a8f71"} Dec 09 03:32:39 crc kubenswrapper[4766]: I1209 03:32:39.816345 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 03:32:39 crc kubenswrapper[4766]: W1209 03:32:39.821127 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d1d344_fbf5_415d_952e_9ee50493a134.slice/crio-19b739acbecbbb03849c91514c71d12e40e3ff07ece2edc6b168f1df0dc03926 
WatchSource:0}: Error finding container 19b739acbecbbb03849c91514c71d12e40e3ff07ece2edc6b168f1df0dc03926: Status 404 returned error can't find the container with id 19b739acbecbbb03849c91514c71d12e40e3ff07ece2edc6b168f1df0dc03926 Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.105278 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.106874 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.109931 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.110240 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.110431 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.110577 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-k6952" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.124972 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.261015 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262131 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 
03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262187 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262224 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262250 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262274 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262297 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvgc8\" (UniqueName: \"kubernetes.io/projected/a57927d7-7099-4b87-99ee-77aa589cd09f-kube-api-access-wvgc8\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " 
pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262329 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262353 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.262613 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.267399 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.267414 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-msw75" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.267609 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.297043 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363495 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gntm\" (UniqueName: \"kubernetes.io/projected/01ada9c6-91af-4717-a157-29070bf61a6e-kube-api-access-4gntm\") pod \"memcached-0\" (UID: 
\"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363545 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363567 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363591 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363636 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363668 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 
03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363687 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363707 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-kolla-config\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363728 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363749 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363772 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvgc8\" (UniqueName: \"kubernetes.io/projected/a57927d7-7099-4b87-99ee-77aa589cd09f-kube-api-access-wvgc8\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363797 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.363818 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-config-data\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.364443 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.365065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.365746 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.367081 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.368671 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.372640 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.376726 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.385619 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvgc8\" (UniqueName: \"kubernetes.io/projected/a57927d7-7099-4b87-99ee-77aa589cd09f-kube-api-access-wvgc8\") pod \"openstack-cell1-galera-0\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.415735 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"a57927d7-7099-4b87-99ee-77aa589cd09f\") " pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.446227 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.465593 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-kolla-config\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.465688 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.465715 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-config-data\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.465743 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gntm\" (UniqueName: \"kubernetes.io/projected/01ada9c6-91af-4717-a157-29070bf61a6e-kube-api-access-4gntm\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.465771 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.466959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-kolla-config\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.466968 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-config-data\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.477236 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.477893 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.497833 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gntm\" (UniqueName: \"kubernetes.io/projected/01ada9c6-91af-4717-a157-29070bf61a6e-kube-api-access-4gntm\") pod \"memcached-0\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.588518 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 09 03:32:40 crc kubenswrapper[4766]: I1209 03:32:40.831424 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26d1d344-fbf5-415d-952e-9ee50493a134","Type":"ContainerStarted","Data":"19b739acbecbbb03849c91514c71d12e40e3ff07ece2edc6b168f1df0dc03926"} Dec 09 03:32:41 crc kubenswrapper[4766]: I1209 03:32:41.008736 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 03:32:41 crc kubenswrapper[4766]: I1209 03:32:41.294832 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 03:32:41 crc kubenswrapper[4766]: I1209 03:32:41.857323 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"01ada9c6-91af-4717-a157-29070bf61a6e","Type":"ContainerStarted","Data":"6bf1c9777693a010170643b0ca7f092e72290fe3f862ff533b489ecae3969912"} Dec 09 03:32:41 crc kubenswrapper[4766]: I1209 03:32:41.870280 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a57927d7-7099-4b87-99ee-77aa589cd09f","Type":"ContainerStarted","Data":"137fa822e08a33698279473229d907147e31d90bbc7c9e4dcb25c748c6933d73"} Dec 09 03:32:41 crc kubenswrapper[4766]: I1209 03:32:41.957031 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:32:41 crc kubenswrapper[4766]: I1209 03:32:41.957988 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 03:32:41 crc kubenswrapper[4766]: I1209 03:32:41.960690 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-kqzsn" Dec 09 03:32:41 crc kubenswrapper[4766]: I1209 03:32:41.968978 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:32:42 crc kubenswrapper[4766]: I1209 03:32:42.099308 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vlxg\" (UniqueName: \"kubernetes.io/projected/292bc9d3-3333-4974-93c2-966d76dfa582-kube-api-access-9vlxg\") pod \"kube-state-metrics-0\" (UID: \"292bc9d3-3333-4974-93c2-966d76dfa582\") " pod="openstack/kube-state-metrics-0" Dec 09 03:32:42 crc kubenswrapper[4766]: I1209 03:32:42.200883 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vlxg\" (UniqueName: \"kubernetes.io/projected/292bc9d3-3333-4974-93c2-966d76dfa582-kube-api-access-9vlxg\") pod \"kube-state-metrics-0\" (UID: \"292bc9d3-3333-4974-93c2-966d76dfa582\") " pod="openstack/kube-state-metrics-0" Dec 09 03:32:42 crc kubenswrapper[4766]: I1209 03:32:42.221155 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vlxg\" (UniqueName: \"kubernetes.io/projected/292bc9d3-3333-4974-93c2-966d76dfa582-kube-api-access-9vlxg\") pod \"kube-state-metrics-0\" (UID: \"292bc9d3-3333-4974-93c2-966d76dfa582\") " pod="openstack/kube-state-metrics-0" Dec 09 03:32:42 crc kubenswrapper[4766]: I1209 03:32:42.290536 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 03:32:42 crc kubenswrapper[4766]: W1209 03:32:42.983692 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod292bc9d3_3333_4974_93c2_966d76dfa582.slice/crio-61be5ff755701f82e78defc00960cb2d6e0ad703db6bd196417fc6a08a1d710a WatchSource:0}: Error finding container 61be5ff755701f82e78defc00960cb2d6e0ad703db6bd196417fc6a08a1d710a: Status 404 returned error can't find the container with id 61be5ff755701f82e78defc00960cb2d6e0ad703db6bd196417fc6a08a1d710a Dec 09 03:32:42 crc kubenswrapper[4766]: I1209 03:32:42.997348 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:32:43 crc kubenswrapper[4766]: I1209 03:32:43.888878 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"292bc9d3-3333-4974-93c2-966d76dfa582","Type":"ContainerStarted","Data":"61be5ff755701f82e78defc00960cb2d6e0ad703db6bd196417fc6a08a1d710a"} Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.324495 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s25fz"] Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.325497 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.331684 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xsg8r" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.331979 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.332104 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.332909 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-h9kbs"] Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.334471 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.339367 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s25fz"] Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.345780 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-h9kbs"] Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465282 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-combined-ca-bundle\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465354 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-lib\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " 
pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465414 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run-ovn\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465433 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lwfk\" (UniqueName: \"kubernetes.io/projected/f28c984f-04eb-4398-af98-9e2c5e6afd13-kube-api-access-9lwfk\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465451 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-etc-ovs\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465495 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465525 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-ovn-controller-tls-certs\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " 
pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465564 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-run\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465658 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z894b\" (UniqueName: \"kubernetes.io/projected/149434b0-ace1-4e8f-9be4-76eb650f7c7f-kube-api-access-z894b\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465678 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/149434b0-ace1-4e8f-9be4-76eb650f7c7f-scripts\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465694 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-log-ovn\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465728 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f28c984f-04eb-4398-af98-9e2c5e6afd13-scripts\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 
03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.465744 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-log\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567297 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-log\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567358 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-combined-ca-bundle\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567392 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-lib\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567419 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run-ovn\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567434 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9lwfk\" (UniqueName: \"kubernetes.io/projected/f28c984f-04eb-4398-af98-9e2c5e6afd13-kube-api-access-9lwfk\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567454 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-etc-ovs\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567472 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567503 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-ovn-controller-tls-certs\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567553 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-run\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567599 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z894b\" (UniqueName: 
\"kubernetes.io/projected/149434b0-ace1-4e8f-9be4-76eb650f7c7f-kube-api-access-z894b\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567619 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/149434b0-ace1-4e8f-9be4-76eb650f7c7f-scripts\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567638 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-log-ovn\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567653 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f28c984f-04eb-4398-af98-9e2c5e6afd13-scripts\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.567979 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-log\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.568173 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " 
pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.568326 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-run\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.568342 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-etc-ovs\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.568396 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-lib\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.568432 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-log-ovn\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.569833 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run-ovn\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.569900 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f28c984f-04eb-4398-af98-9e2c5e6afd13-scripts\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.570426 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/149434b0-ace1-4e8f-9be4-76eb650f7c7f-scripts\") pod \"ovn-controller-ovs-h9kbs\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.579734 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-ovn-controller-tls-certs\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.582676 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lwfk\" (UniqueName: \"kubernetes.io/projected/f28c984f-04eb-4398-af98-9e2c5e6afd13-kube-api-access-9lwfk\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.584535 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-combined-ca-bundle\") pod \"ovn-controller-s25fz\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.586656 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z894b\" (UniqueName: \"kubernetes.io/projected/149434b0-ace1-4e8f-9be4-76eb650f7c7f-kube-api-access-z894b\") pod \"ovn-controller-ovs-h9kbs\" (UID: 
\"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.615198 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.616740 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.618947 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.624457 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.624654 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-w6zzw" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.624925 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.625141 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.646247 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.656936 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s25fz" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.668441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvhdn\" (UniqueName: \"kubernetes.io/projected/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-kube-api-access-mvhdn\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.668492 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.668533 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.668560 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.668590 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" 
Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.668620 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.668637 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.668676 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-config\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.669382 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.770574 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.770675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.770747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.770807 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.770832 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.771393 4766 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.772741 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-config\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.772857 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.772876 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvhdn\" (UniqueName: \"kubernetes.io/projected/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-kube-api-access-mvhdn\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.772949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.773554 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.773549 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.775137 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.776362 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.787303 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.788682 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvhdn\" (UniqueName: \"kubernetes.io/projected/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-kube-api-access-mvhdn\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.794860 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:46 crc kubenswrapper[4766]: I1209 03:32:46.961642 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.724651 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.726129 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.730149 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.730350 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-b7f55" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.730497 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.730737 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.740255 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.807597 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " 
pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.807669 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m29xs\" (UniqueName: \"kubernetes.io/projected/bf108478-7651-4f37-b0e7-3a571774d030-kube-api-access-m29xs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.807718 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.807737 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-config\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.808051 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.808118 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf108478-7651-4f37-b0e7-3a571774d030-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.808198 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.808271 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.909408 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.909462 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-config\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.909527 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.909558 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/bf108478-7651-4f37-b0e7-3a571774d030-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.910098 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.910130 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.910166 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.910224 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m29xs\" (UniqueName: \"kubernetes.io/projected/bf108478-7651-4f37-b0e7-3a571774d030-kube-api-access-m29xs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.910489 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.909804 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.911441 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.911542 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf108478-7651-4f37-b0e7-3a571774d030-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.915650 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.922705 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.924473 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.932815 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m29xs\" (UniqueName: \"kubernetes.io/projected/bf108478-7651-4f37-b0e7-3a571774d030-kube-api-access-m29xs\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:48 crc kubenswrapper[4766]: I1209 03:32:48.960720 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " pod="openstack/ovsdbserver-sb-0" Dec 09 03:32:49 crc kubenswrapper[4766]: I1209 03:32:49.054396 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 03:33:07 crc kubenswrapper[4766]: I1209 03:33:07.316457 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:33:07 crc kubenswrapper[4766]: I1209 03:33:07.316937 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:33:10 crc kubenswrapper[4766]: E1209 03:33:10.298942 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 09 03:33:10 crc kubenswrapper[4766]: E1209 03:33:10.300084 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvgc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(a57927d7-7099-4b87-99ee-77aa589cd09f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:33:10 crc kubenswrapper[4766]: E1209 03:33:10.301342 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" Dec 09 03:33:10 crc kubenswrapper[4766]: E1209 03:33:10.339449 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 09 03:33:10 crc kubenswrapper[4766]: E1209 03:33:10.339600 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bktq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(26d1d344-fbf5-415d-952e-9ee50493a134): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:33:10 crc kubenswrapper[4766]: E1209 03:33:10.341336 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" Dec 09 03:33:10 crc kubenswrapper[4766]: E1209 03:33:10.343473 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 09 03:33:10 crc kubenswrapper[4766]: E1209 03:33:10.343604 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gn76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(3af438c1-d0b9-4ecb-bb88-a0efd14736a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:33:10 crc 
kubenswrapper[4766]: E1209 03:33:10.344815 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" Dec 09 03:33:11 crc kubenswrapper[4766]: E1209 03:33:11.126647 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" Dec 09 03:33:11 crc kubenswrapper[4766]: E1209 03:33:11.126759 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.611395 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.611720 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b5wv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-kr5sq_openstack(231db44a-48fc-4dd4-95b3-76cea8a3ef13): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.612185 4766 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.612333 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxcj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesyst
em:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-k8khs_openstack(cb455f12-00ad-41c6-8035-4384770f42ed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.613374 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" podUID="231db44a-48fc-4dd4-95b3-76cea8a3ef13" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.613428 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" podUID="cb455f12-00ad-41c6-8035-4384770f42ed" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.624409 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.624817 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bqc96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-cpkpv_openstack(079f2043-c863-48cb-b833-cee2091fcdc7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.626021 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" podUID="079f2043-c863-48cb-b833-cee2091fcdc7" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.628857 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.629006 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tc754,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Read
inessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-88pjz_openstack(44ec5489-8c07-4223-875f-d0e43796ff74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:33:14 crc kubenswrapper[4766]: E1209 03:33:14.630831 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" podUID="44ec5489-8c07-4223-875f-d0e43796ff74" Dec 09 03:33:14 crc kubenswrapper[4766]: I1209 03:33:14.951465 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s25fz"] Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.164816 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz" event={"ID":"f28c984f-04eb-4398-af98-9e2c5e6afd13","Type":"ContainerStarted","Data":"c92b2db5364b6a896e5c6ccd0c8f9f1bcb96154372510e3ce5fc3f0d3bc9e4e8"} Dec 09 03:33:15 crc kubenswrapper[4766]: E1209 03:33:15.168890 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" 
pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" podUID="cb455f12-00ad-41c6-8035-4384770f42ed" Dec 09 03:33:15 crc kubenswrapper[4766]: E1209 03:33:15.169021 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" podUID="44ec5489-8c07-4223-875f-d0e43796ff74" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.267048 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.328951 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.658322 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.668253 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:33:15 crc kubenswrapper[4766]: E1209 03:33:15.668565 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:9aee425378d2c16cd44177dc54a274b312897f5860a8e78fdfda555a0d79dd71: Get \"https://registry.k8s.io/v2/kube-state-metrics/kube-state-metrics/blobs/sha256:9aee425378d2c16cd44177dc54a274b312897f5860a8e78fdfda555a0d79dd71\": context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 09 03:33:15 crc kubenswrapper[4766]: E1209 03:33:15.668631 4766 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:9aee425378d2c16cd44177dc54a274b312897f5860a8e78fdfda555a0d79dd71: Get \"https://registry.k8s.io/v2/kube-state-metrics/kube-state-metrics/blobs/sha256:9aee425378d2c16cd44177dc54a274b312897f5860a8e78fdfda555a0d79dd71\": context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 09 03:33:15 crc kubenswrapper[4766]: E1209 03:33:15.669033 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9vlxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(292bc9d3-3333-4974-93c2-966d76dfa582): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading 
blob sha256:9aee425378d2c16cd44177dc54a274b312897f5860a8e78fdfda555a0d79dd71: Get \"https://registry.k8s.io/v2/kube-state-metrics/kube-state-metrics/blobs/sha256:9aee425378d2c16cd44177dc54a274b312897f5860a8e78fdfda555a0d79dd71\": context canceled" logger="UnhandledError" Dec 09 03:33:15 crc kubenswrapper[4766]: E1209 03:33:15.670496 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:9aee425378d2c16cd44177dc54a274b312897f5860a8e78fdfda555a0d79dd71: Get \\\"https://registry.k8s.io/v2/kube-state-metrics/kube-state-metrics/blobs/sha256:9aee425378d2c16cd44177dc54a274b312897f5860a8e78fdfda555a0d79dd71\\\": context canceled\"" pod="openstack/kube-state-metrics-0" podUID="292bc9d3-3333-4974-93c2-966d76dfa582" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.742975 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-h9kbs"] Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.801183 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqc96\" (UniqueName: \"kubernetes.io/projected/079f2043-c863-48cb-b833-cee2091fcdc7-kube-api-access-bqc96\") pod \"079f2043-c863-48cb-b833-cee2091fcdc7\" (UID: \"079f2043-c863-48cb-b833-cee2091fcdc7\") " Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.801233 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-config\") pod \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.801273 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/079f2043-c863-48cb-b833-cee2091fcdc7-config\") pod \"079f2043-c863-48cb-b833-cee2091fcdc7\" (UID: \"079f2043-c863-48cb-b833-cee2091fcdc7\") " Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.801345 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-dns-svc\") pod \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.801361 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/231db44a-48fc-4dd4-95b3-76cea8a3ef13-kube-api-access-b5wv5\") pod \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\" (UID: \"231db44a-48fc-4dd4-95b3-76cea8a3ef13\") " Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.801688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-config" (OuterVolumeSpecName: "config") pod "231db44a-48fc-4dd4-95b3-76cea8a3ef13" (UID: "231db44a-48fc-4dd4-95b3-76cea8a3ef13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.801853 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079f2043-c863-48cb-b833-cee2091fcdc7-config" (OuterVolumeSpecName: "config") pod "079f2043-c863-48cb-b833-cee2091fcdc7" (UID: "079f2043-c863-48cb-b833-cee2091fcdc7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.801935 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "231db44a-48fc-4dd4-95b3-76cea8a3ef13" (UID: "231db44a-48fc-4dd4-95b3-76cea8a3ef13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.806503 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079f2043-c863-48cb-b833-cee2091fcdc7-kube-api-access-bqc96" (OuterVolumeSpecName: "kube-api-access-bqc96") pod "079f2043-c863-48cb-b833-cee2091fcdc7" (UID: "079f2043-c863-48cb-b833-cee2091fcdc7"). InnerVolumeSpecName "kube-api-access-bqc96". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.806551 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231db44a-48fc-4dd4-95b3-76cea8a3ef13-kube-api-access-b5wv5" (OuterVolumeSpecName: "kube-api-access-b5wv5") pod "231db44a-48fc-4dd4-95b3-76cea8a3ef13" (UID: "231db44a-48fc-4dd4-95b3-76cea8a3ef13"). InnerVolumeSpecName "kube-api-access-b5wv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.910719 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.910755 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5wv5\" (UniqueName: \"kubernetes.io/projected/231db44a-48fc-4dd4-95b3-76cea8a3ef13-kube-api-access-b5wv5\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.910770 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqc96\" (UniqueName: \"kubernetes.io/projected/079f2043-c863-48cb-b833-cee2091fcdc7-kube-api-access-bqc96\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.910783 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231db44a-48fc-4dd4-95b3-76cea8a3ef13-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:15 crc kubenswrapper[4766]: I1209 03:33:15.910796 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079f2043-c863-48cb-b833-cee2091fcdc7-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.177777 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af","Type":"ContainerStarted","Data":"a4a8ba2bfdd506a1c1b89820974c6e51623c0fe28b6eedb24965e834e14c36cd"} Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.179467 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h9kbs" event={"ID":"149434b0-ace1-4e8f-9be4-76eb650f7c7f","Type":"ContainerStarted","Data":"a788d5ee525d091bda823d25451d7e2fd0e5db2d8e284abdcb3d065876a6baf2"} Dec 09 
03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.180986 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" event={"ID":"231db44a-48fc-4dd4-95b3-76cea8a3ef13","Type":"ContainerDied","Data":"da1f22d8f30655273e53c21bf9efafba63ae4f4d85077676969218cd534221b8"} Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.181056 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kr5sq" Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.182060 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" event={"ID":"079f2043-c863-48cb-b833-cee2091fcdc7","Type":"ContainerDied","Data":"4c8704b17962fa34aa18ded663a43aafe5ca6fe34083143c44fb17daff9ab6a8"} Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.182133 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cpkpv" Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.184499 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf108478-7651-4f37-b0e7-3a571774d030","Type":"ContainerStarted","Data":"f46190535232e13e382b5632c2fd59eb976e2a34a4308e7615d3a96df87994ca"} Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.186184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"01ada9c6-91af-4717-a157-29070bf61a6e","Type":"ContainerStarted","Data":"d07009fe163e1ff1f0efb68a9304c65721f93731af9b5db45462d77518e37f29"} Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.186250 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 09 03:33:16 crc kubenswrapper[4766]: E1209 03:33:16.189333 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="292bc9d3-3333-4974-93c2-966d76dfa582" Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.226128 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.6661961549999997 podStartE2EDuration="36.226107249s" podCreationTimestamp="2025-12-09 03:32:40 +0000 UTC" firstStartedPulling="2025-12-09 03:32:41.349972853 +0000 UTC m=+1243.059278279" lastFinishedPulling="2025-12-09 03:33:13.909883947 +0000 UTC m=+1275.619189373" observedRunningTime="2025-12-09 03:33:16.222726518 +0000 UTC m=+1277.932031954" watchObservedRunningTime="2025-12-09 03:33:16.226107249 +0000 UTC m=+1277.935412675" Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.261872 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kr5sq"] Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.275685 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kr5sq"] Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.291145 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cpkpv"] Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.298777 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cpkpv"] Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.851287 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079f2043-c863-48cb-b833-cee2091fcdc7" path="/var/lib/kubelet/pods/079f2043-c863-48cb-b833-cee2091fcdc7/volumes" Dec 09 03:33:16 crc kubenswrapper[4766]: I1209 03:33:16.851737 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231db44a-48fc-4dd4-95b3-76cea8a3ef13" path="/var/lib/kubelet/pods/231db44a-48fc-4dd4-95b3-76cea8a3ef13/volumes" Dec 09 03:33:17 crc kubenswrapper[4766]: I1209 03:33:17.196296 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48862672-08e2-4ac6-86a3-57d84bbc868d","Type":"ContainerStarted","Data":"953d3d0d70a4b3d2cd1bec6f32e0bdb69cc885c56c1891b2288ff895a80a5c71"} Dec 09 03:33:17 crc kubenswrapper[4766]: I1209 03:33:17.200850 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3af438c1-d0b9-4ecb-bb88-a0efd14736a4","Type":"ContainerStarted","Data":"89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238"} Dec 09 03:33:19 crc kubenswrapper[4766]: I1209 03:33:19.217151 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af","Type":"ContainerStarted","Data":"945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd"} Dec 09 03:33:19 crc kubenswrapper[4766]: I1209 03:33:19.218548 4766 generic.go:334] "Generic (PLEG): container finished" podID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerID="a0903176e01683b3ccdb86281a0e088025dfb50a20e870a2e7c0105d7f97b042" exitCode=0 Dec 09 03:33:19 crc kubenswrapper[4766]: I1209 03:33:19.218753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h9kbs" event={"ID":"149434b0-ace1-4e8f-9be4-76eb650f7c7f","Type":"ContainerDied","Data":"a0903176e01683b3ccdb86281a0e088025dfb50a20e870a2e7c0105d7f97b042"} Dec 09 03:33:19 crc kubenswrapper[4766]: I1209 03:33:19.219779 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz" event={"ID":"f28c984f-04eb-4398-af98-9e2c5e6afd13","Type":"ContainerStarted","Data":"675041bc225bfcd7e5dd3621e06519a2045ddfa0f1dfb2b19585bfe57589d512"} Dec 09 03:33:19 crc kubenswrapper[4766]: I1209 03:33:19.219908 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s25fz" Dec 09 03:33:19 crc kubenswrapper[4766]: I1209 03:33:19.221364 4766 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf108478-7651-4f37-b0e7-3a571774d030","Type":"ContainerStarted","Data":"2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1"} Dec 09 03:33:19 crc kubenswrapper[4766]: I1209 03:33:19.257062 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s25fz" podStartSLOduration=29.785086918 podStartE2EDuration="33.257028811s" podCreationTimestamp="2025-12-09 03:32:46 +0000 UTC" firstStartedPulling="2025-12-09 03:33:15.102418689 +0000 UTC m=+1276.811724115" lastFinishedPulling="2025-12-09 03:33:18.574360582 +0000 UTC m=+1280.283666008" observedRunningTime="2025-12-09 03:33:19.253511977 +0000 UTC m=+1280.962817413" watchObservedRunningTime="2025-12-09 03:33:19.257028811 +0000 UTC m=+1280.966334237" Dec 09 03:33:20 crc kubenswrapper[4766]: I1209 03:33:20.230435 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h9kbs" event={"ID":"149434b0-ace1-4e8f-9be4-76eb650f7c7f","Type":"ContainerStarted","Data":"1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48"} Dec 09 03:33:20 crc kubenswrapper[4766]: I1209 03:33:20.230813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h9kbs" event={"ID":"149434b0-ace1-4e8f-9be4-76eb650f7c7f","Type":"ContainerStarted","Data":"d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1"} Dec 09 03:33:20 crc kubenswrapper[4766]: I1209 03:33:20.269837 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-h9kbs" podStartSLOduration=31.444887847 podStartE2EDuration="34.26981445s" podCreationTimestamp="2025-12-09 03:32:46 +0000 UTC" firstStartedPulling="2025-12-09 03:33:15.74649624 +0000 UTC m=+1277.455801666" lastFinishedPulling="2025-12-09 03:33:18.571422843 +0000 UTC m=+1280.280728269" observedRunningTime="2025-12-09 03:33:20.253538702 +0000 UTC m=+1281.962844138" 
watchObservedRunningTime="2025-12-09 03:33:20.26981445 +0000 UTC m=+1281.979119876" Dec 09 03:33:20 crc kubenswrapper[4766]: I1209 03:33:20.590984 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 09 03:33:21 crc kubenswrapper[4766]: I1209 03:33:21.238074 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:33:21 crc kubenswrapper[4766]: I1209 03:33:21.238460 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.217355 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-88pjz"] Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.261038 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-nmg8j"] Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.262405 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.263610 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf108478-7651-4f37-b0e7-3a571774d030","Type":"ContainerStarted","Data":"eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033"} Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.281792 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-nmg8j"] Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.287731 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af","Type":"ContainerStarted","Data":"16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754"} Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.331199 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=30.740844707 podStartE2EDuration="37.331176876s" podCreationTimestamp="2025-12-09 03:32:45 +0000 UTC" firstStartedPulling="2025-12-09 03:33:15.339029112 +0000 UTC m=+1277.048334538" lastFinishedPulling="2025-12-09 03:33:21.929361281 +0000 UTC m=+1283.638666707" observedRunningTime="2025-12-09 03:33:22.329576264 +0000 UTC m=+1284.038881700" watchObservedRunningTime="2025-12-09 03:33:22.331176876 +0000 UTC m=+1284.040482302" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.375268 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=28.895052535 podStartE2EDuration="35.375226501s" podCreationTimestamp="2025-12-09 03:32:47 +0000 UTC" firstStartedPulling="2025-12-09 03:33:15.440203733 +0000 UTC m=+1277.149509169" lastFinishedPulling="2025-12-09 03:33:21.920377709 +0000 UTC m=+1283.629683135" observedRunningTime="2025-12-09 03:33:22.358768129 +0000 UTC m=+1284.068073555" 
watchObservedRunningTime="2025-12-09 03:33:22.375226501 +0000 UTC m=+1284.084531927" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.467176 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcs6b\" (UniqueName: \"kubernetes.io/projected/f179197a-80ec-4ef2-8507-46f90b036562-kube-api-access-zcs6b\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.467634 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-config\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.467675 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.554423 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.573204 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcs6b\" (UniqueName: \"kubernetes.io/projected/f179197a-80ec-4ef2-8507-46f90b036562-kube-api-access-zcs6b\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.573322 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-config\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.573349 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.574378 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.575332 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-config\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 
03:33:22.593903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcs6b\" (UniqueName: \"kubernetes.io/projected/f179197a-80ec-4ef2-8507-46f90b036562-kube-api-access-zcs6b\") pod \"dnsmasq-dns-7cb5889db5-nmg8j\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.602651 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.674555 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-config\") pod \"44ec5489-8c07-4223-875f-d0e43796ff74\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.674591 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-dns-svc\") pod \"44ec5489-8c07-4223-875f-d0e43796ff74\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.674628 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc754\" (UniqueName: \"kubernetes.io/projected/44ec5489-8c07-4223-875f-d0e43796ff74-kube-api-access-tc754\") pod \"44ec5489-8c07-4223-875f-d0e43796ff74\" (UID: \"44ec5489-8c07-4223-875f-d0e43796ff74\") " Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.674972 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44ec5489-8c07-4223-875f-d0e43796ff74" (UID: "44ec5489-8c07-4223-875f-d0e43796ff74"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.675092 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-config" (OuterVolumeSpecName: "config") pod "44ec5489-8c07-4223-875f-d0e43796ff74" (UID: "44ec5489-8c07-4223-875f-d0e43796ff74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.681479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ec5489-8c07-4223-875f-d0e43796ff74-kube-api-access-tc754" (OuterVolumeSpecName: "kube-api-access-tc754") pod "44ec5489-8c07-4223-875f-d0e43796ff74" (UID: "44ec5489-8c07-4223-875f-d0e43796ff74"). InnerVolumeSpecName "kube-api-access-tc754". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.776772 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.777107 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44ec5489-8c07-4223-875f-d0e43796ff74-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.777122 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc754\" (UniqueName: \"kubernetes.io/projected/44ec5489-8c07-4223-875f-d0e43796ff74-kube-api-access-tc754\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:22 crc kubenswrapper[4766]: I1209 03:33:22.963688 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.012773 4766 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.036027 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-nmg8j"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.304927 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" event={"ID":"44ec5489-8c07-4223-875f-d0e43796ff74","Type":"ContainerDied","Data":"39f124cdc3fd29a941802cdd5439d8093a5290336cf9c822ea1cf1dd3883219d"} Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.305007 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-88pjz" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.306429 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" event={"ID":"f179197a-80ec-4ef2-8507-46f90b036562","Type":"ContainerStarted","Data":"3fc8c283adeda0c9955ae9338ccdf3c2bc8bccd77ffbe402efa2285c2fefd333"} Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.312843 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26d1d344-fbf5-415d-952e-9ee50493a134","Type":"ContainerStarted","Data":"43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133"} Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.314473 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.347781 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-88pjz"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.372279 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-88pjz"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.377973 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-nb-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.384783 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.391179 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.393393 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.393768 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-v42vz" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.393912 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.394051 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.400921 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.486799 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-cache\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.486951 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.487395 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbl5z\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-kube-api-access-zbl5z\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.487649 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-lock\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.487695 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.589014 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-cache\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.589087 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.589118 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbl5z\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-kube-api-access-zbl5z\") pod 
\"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0"
Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.589161 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-lock\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0"
Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.589182 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0"
Dec 09 03:33:23 crc kubenswrapper[4766]: E1209 03:33:23.589283 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 09 03:33:23 crc kubenswrapper[4766]: E1209 03:33:23.589303 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 09 03:33:23 crc kubenswrapper[4766]: E1209 03:33:23.589360 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift podName:a13b4958-6576-4cdb-8237-7e8bedeef9fc nodeName:}" failed. No retries permitted until 2025-12-09 03:33:24.089342444 +0000 UTC m=+1285.798647870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift") pod "swift-storage-0" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc") : configmap "swift-ring-files" not found
Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.589480 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0"
Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.589728 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-cache\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0"
Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.589781 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-lock\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0"
Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.614457 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbl5z\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-kube-api-access-zbl5z\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0"
Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.618376 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") "
pod="openstack/swift-storage-0" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.628129 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-k8khs"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.664707 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-mbcht"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.665942 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.668108 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.683459 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-mbcht"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.795270 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6gpb\" (UniqueName: \"kubernetes.io/projected/3f4759c2-d12a-49a4-b246-824198f9fd2a-kube-api-access-w6gpb\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.795421 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.795487 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" 
(UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.795542 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-config\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.861710 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-d2mrs"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.863653 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.869764 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.874821 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d2mrs"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.889840 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-hb6ct"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.890941 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.894146 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.894320 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.894448 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.896424 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hb6ct"] Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.896601 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6gpb\" (UniqueName: \"kubernetes.io/projected/3f4759c2-d12a-49a4-b246-824198f9fd2a-kube-api-access-w6gpb\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.896652 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.896678 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.896703 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-config\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.897518 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.897603 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.910487 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-config\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.916852 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6gpb\" (UniqueName: \"kubernetes.io/projected/3f4759c2-d12a-49a4-b246-824198f9fd2a-kube-api-access-w6gpb\") pod \"dnsmasq-dns-74f6f696b9-mbcht\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.998827 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-scripts\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.998890 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/068033ee-d9d8-4cbb-b82a-ced63f563e08-etc-swift\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.998911 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-combined-ca-bundle\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.998945 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lntv\" (UniqueName: \"kubernetes.io/projected/1554173f-b66c-43d5-a5e4-cd10a81f09d4-kube-api-access-4lntv\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.998966 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxlt7\" (UniqueName: \"kubernetes.io/projected/068033ee-d9d8-4cbb-b82a-ced63f563e08-kube-api-access-vxlt7\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.998990 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1554173f-b66c-43d5-a5e4-cd10a81f09d4-config\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.999005 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-dispersionconf\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.999028 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-swiftconf\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.999072 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovs-rundir\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.999095 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-combined-ca-bundle\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.999121 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-ring-data-devices\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.999144 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovn-rundir\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:23 crc kubenswrapper[4766]: I1209 03:33:23.999161 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.015333 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.030720 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.055281 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.100480 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-config\") pod \"cb455f12-00ad-41c6-8035-4384770f42ed\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.100644 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxcj6\" (UniqueName: \"kubernetes.io/projected/cb455f12-00ad-41c6-8035-4384770f42ed-kube-api-access-bxcj6\") pod \"cb455f12-00ad-41c6-8035-4384770f42ed\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.100922 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-config" (OuterVolumeSpecName: "config") pod "cb455f12-00ad-41c6-8035-4384770f42ed" (UID: "cb455f12-00ad-41c6-8035-4384770f42ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.101352 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-dns-svc\") pod \"cb455f12-00ad-41c6-8035-4384770f42ed\" (UID: \"cb455f12-00ad-41c6-8035-4384770f42ed\") " Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.101586 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lntv\" (UniqueName: \"kubernetes.io/projected/1554173f-b66c-43d5-a5e4-cd10a81f09d4-kube-api-access-4lntv\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.101616 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxlt7\" (UniqueName: \"kubernetes.io/projected/068033ee-d9d8-4cbb-b82a-ced63f563e08-kube-api-access-vxlt7\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.101645 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1554173f-b66c-43d5-a5e4-cd10a81f09d4-config\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.101665 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-dispersionconf\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 
03:33:24.101688 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-swiftconf\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.101838 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb455f12-00ad-41c6-8035-4384770f42ed" (UID: "cb455f12-00ad-41c6-8035-4384770f42ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102047 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102084 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovs-rundir\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102116 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-combined-ca-bundle\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102145 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-ring-data-devices\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102172 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovn-rundir\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102192 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102240 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-scripts\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102271 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/068033ee-d9d8-4cbb-b82a-ced63f563e08-etc-swift\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102291 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-combined-ca-bundle\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102869 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovn-rundir\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.102971 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.103492 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1554173f-b66c-43d5-a5e4-cd10a81f09d4-config\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.103569 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-ring-data-devices\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.103930 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-scripts\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: E1209 
03:33:24.104000 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 09 03:33:24 crc kubenswrapper[4766]: E1209 03:33:24.104013 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 09 03:33:24 crc kubenswrapper[4766]: E1209 03:33:24.104049 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift podName:a13b4958-6576-4cdb-8237-7e8bedeef9fc nodeName:}" failed. No retries permitted until 2025-12-09 03:33:25.104035985 +0000 UTC m=+1286.813341411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift") pod "swift-storage-0" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc") : configmap "swift-ring-files" not found
Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.104105 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/068033ee-d9d8-4cbb-b82a-ced63f563e08-etc-swift\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct"
Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.104477 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb455f12-00ad-41c6-8035-4384770f42ed-config\") on node \"crc\" DevicePath \"\""
Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.106338 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb455f12-00ad-41c6-8035-4384770f42ed-kube-api-access-bxcj6" (OuterVolumeSpecName: "kube-api-access-bxcj6") pod "cb455f12-00ad-41c6-8035-4384770f42ed" (UID: "cb455f12-00ad-41c6-8035-4384770f42ed"). InnerVolumeSpecName "kube-api-access-bxcj6".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.107006 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovs-rundir\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.107026 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-combined-ca-bundle\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.112988 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-swiftconf\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.113075 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-combined-ca-bundle\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.114718 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-dispersionconf\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.118798 
4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.132468 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lntv\" (UniqueName: \"kubernetes.io/projected/1554173f-b66c-43d5-a5e4-cd10a81f09d4-kube-api-access-4lntv\") pod \"ovn-controller-metrics-d2mrs\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.134364 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxlt7\" (UniqueName: \"kubernetes.io/projected/068033ee-d9d8-4cbb-b82a-ced63f563e08-kube-api-access-vxlt7\") pod \"swift-ring-rebalance-hb6ct\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.143668 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-nmg8j"] Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.181388 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.205721 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxcj6\" (UniqueName: \"kubernetes.io/projected/cb455f12-00ad-41c6-8035-4384770f42ed-kube-api-access-bxcj6\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.209319 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-586lc"] Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.211312 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.213342 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.223211 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.242302 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-586lc"] Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.307129 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-dns-svc\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.307170 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " 
pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.307191 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49ml5\" (UniqueName: \"kubernetes.io/projected/007b7fec-0143-464a-a4bf-9cf5354e5934-kube-api-access-49ml5\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.307358 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-config\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.307458 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.350305 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" event={"ID":"cb455f12-00ad-41c6-8035-4384770f42ed","Type":"ContainerDied","Data":"00d8a15fd3e007d2f91335fbc724959eac41e69d83c1c39a425a57da8ab410a1"} Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.350438 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-k8khs" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.361081 4766 generic.go:334] "Generic (PLEG): container finished" podID="f179197a-80ec-4ef2-8507-46f90b036562" containerID="2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f" exitCode=0 Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.362395 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" event={"ID":"f179197a-80ec-4ef2-8507-46f90b036562","Type":"ContainerDied","Data":"2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f"} Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.410000 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.410148 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-dns-svc\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.410170 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.410195 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49ml5\" (UniqueName: 
\"kubernetes.io/projected/007b7fec-0143-464a-a4bf-9cf5354e5934-kube-api-access-49ml5\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.410226 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-config\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.411904 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-dns-svc\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.412771 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.423075 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.423905 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-config\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: 
\"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.450306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49ml5\" (UniqueName: \"kubernetes.io/projected/007b7fec-0143-464a-a4bf-9cf5354e5934-kube-api-access-49ml5\") pod \"dnsmasq-dns-698758b865-586lc\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") " pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.458437 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-k8khs"] Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.473318 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-k8khs"] Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.535171 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.543202 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-mbcht"] Dec 09 03:33:24 crc kubenswrapper[4766]: W1209 03:33:24.553316 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4759c2_d12a_49a4_b246_824198f9fd2a.slice/crio-1d0eaa3f5f61ea131f26daf8a35814eb63f7a5b7c013e7baebf054ca90e5fce3 WatchSource:0}: Error finding container 1d0eaa3f5f61ea131f26daf8a35814eb63f7a5b7c013e7baebf054ca90e5fce3: Status 404 returned error can't find the container with id 1d0eaa3f5f61ea131f26daf8a35814eb63f7a5b7c013e7baebf054ca90e5fce3 Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.727973 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hb6ct"] Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.748495 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-metrics-d2mrs"] Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.852807 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ec5489-8c07-4223-875f-d0e43796ff74" path="/var/lib/kubelet/pods/44ec5489-8c07-4223-875f-d0e43796ff74/volumes" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.853798 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb455f12-00ad-41c6-8035-4384770f42ed" path="/var/lib/kubelet/pods/cb455f12-00ad-41c6-8035-4384770f42ed/volumes" Dec 09 03:33:24 crc kubenswrapper[4766]: I1209 03:33:24.866636 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-586lc"] Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.054644 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.097782 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.119586 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:25 crc kubenswrapper[4766]: E1209 03:33:25.119916 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 03:33:25 crc kubenswrapper[4766]: E1209 03:33:25.119943 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 03:33:25 crc kubenswrapper[4766]: E1209 03:33:25.119995 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift 
podName:a13b4958-6576-4cdb-8237-7e8bedeef9fc nodeName:}" failed. No retries permitted until 2025-12-09 03:33:27.119976628 +0000 UTC m=+1288.829282054 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift") pod "swift-storage-0" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc") : configmap "swift-ring-files" not found Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.370384 4766 generic.go:334] "Generic (PLEG): container finished" podID="007b7fec-0143-464a-a4bf-9cf5354e5934" containerID="0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb" exitCode=0 Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.370444 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-586lc" event={"ID":"007b7fec-0143-464a-a4bf-9cf5354e5934","Type":"ContainerDied","Data":"0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb"} Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.370790 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-586lc" event={"ID":"007b7fec-0143-464a-a4bf-9cf5354e5934","Type":"ContainerStarted","Data":"2c28f7e7972b4bfbbe65a248324150f845013f2525cf8731a6e08cacf8aaaae0"} Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.372264 4766 generic.go:334] "Generic (PLEG): container finished" podID="3f4759c2-d12a-49a4-b246-824198f9fd2a" containerID="27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6" exitCode=0 Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.372336 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" event={"ID":"3f4759c2-d12a-49a4-b246-824198f9fd2a","Type":"ContainerDied","Data":"27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6"} Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.372363 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" event={"ID":"3f4759c2-d12a-49a4-b246-824198f9fd2a","Type":"ContainerStarted","Data":"1d0eaa3f5f61ea131f26daf8a35814eb63f7a5b7c013e7baebf054ca90e5fce3"} Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.374648 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" event={"ID":"f179197a-80ec-4ef2-8507-46f90b036562","Type":"ContainerStarted","Data":"3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54"} Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.374803 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" podUID="f179197a-80ec-4ef2-8507-46f90b036562" containerName="dnsmasq-dns" containerID="cri-o://3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54" gracePeriod=10 Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.374903 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.377008 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d2mrs" event={"ID":"1554173f-b66c-43d5-a5e4-cd10a81f09d4","Type":"ContainerStarted","Data":"ea40d431b90f6866f082e789a7c9306a894b603e172fe4f549ac16e4a66c9f58"} Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.377040 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d2mrs" event={"ID":"1554173f-b66c-43d5-a5e4-cd10a81f09d4","Type":"ContainerStarted","Data":"1b0f3607954b354a6791b82c9ce793c30e8a89485397e093cde40f4f7445f1ed"} Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.380892 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hb6ct" event={"ID":"068033ee-d9d8-4cbb-b82a-ced63f563e08","Type":"ContainerStarted","Data":"9041f5cdc545206fc8c9a4bc640a0bb3988e2a56936d372a9f9096c677422ec3"} Dec 09 
03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.433857 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.441535 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" podStartSLOduration=2.794224017 podStartE2EDuration="3.441518076s" podCreationTimestamp="2025-12-09 03:33:22 +0000 UTC" firstStartedPulling="2025-12-09 03:33:23.04596274 +0000 UTC m=+1284.755268166" lastFinishedPulling="2025-12-09 03:33:23.693256799 +0000 UTC m=+1285.402562225" observedRunningTime="2025-12-09 03:33:25.435461813 +0000 UTC m=+1287.144767239" watchObservedRunningTime="2025-12-09 03:33:25.441518076 +0000 UTC m=+1287.150823502" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.462991 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-d2mrs" podStartSLOduration=2.462967252 podStartE2EDuration="2.462967252s" podCreationTimestamp="2025-12-09 03:33:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:25.451045312 +0000 UTC m=+1287.160350738" watchObservedRunningTime="2025-12-09 03:33:25.462967252 +0000 UTC m=+1287.172272678" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.618377 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.619990 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.623547 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6xj4d" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.623577 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.623713 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.623861 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.639063 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.749236 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94jzf\" (UniqueName: \"kubernetes.io/projected/45ff249a-854d-4c30-8216-b7bd9482e08c-kube-api-access-94jzf\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.749847 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.749916 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-scripts\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " 
pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.749943 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.749970 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-config\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.750174 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.750232 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.852266 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.852324 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.852366 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94jzf\" (UniqueName: \"kubernetes.io/projected/45ff249a-854d-4c30-8216-b7bd9482e08c-kube-api-access-94jzf\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.852399 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.852436 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-scripts\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.852461 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.852485 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-config\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") 
" pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.853490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-scripts\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.855923 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-config\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.856971 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.859772 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.860832 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.868234 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.870088 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.879016 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94jzf\" (UniqueName: \"kubernetes.io/projected/45ff249a-854d-4c30-8216-b7bd9482e08c-kube-api-access-94jzf\") pod \"ovn-northd-0\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " pod="openstack/ovn-northd-0" Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.954112 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-config\") pod \"f179197a-80ec-4ef2-8507-46f90b036562\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.954431 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcs6b\" (UniqueName: \"kubernetes.io/projected/f179197a-80ec-4ef2-8507-46f90b036562-kube-api-access-zcs6b\") pod \"f179197a-80ec-4ef2-8507-46f90b036562\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.954501 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-dns-svc\") pod \"f179197a-80ec-4ef2-8507-46f90b036562\" (UID: \"f179197a-80ec-4ef2-8507-46f90b036562\") " Dec 09 03:33:25 crc kubenswrapper[4766]: I1209 03:33:25.962312 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f179197a-80ec-4ef2-8507-46f90b036562-kube-api-access-zcs6b" (OuterVolumeSpecName: "kube-api-access-zcs6b") pod "f179197a-80ec-4ef2-8507-46f90b036562" (UID: "f179197a-80ec-4ef2-8507-46f90b036562"). InnerVolumeSpecName "kube-api-access-zcs6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.013050 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.039040 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-config" (OuterVolumeSpecName: "config") pod "f179197a-80ec-4ef2-8507-46f90b036562" (UID: "f179197a-80ec-4ef2-8507-46f90b036562"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.051641 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f179197a-80ec-4ef2-8507-46f90b036562" (UID: "f179197a-80ec-4ef2-8507-46f90b036562"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.056387 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcs6b\" (UniqueName: \"kubernetes.io/projected/f179197a-80ec-4ef2-8507-46f90b036562-kube-api-access-zcs6b\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.056423 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.056435 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f179197a-80ec-4ef2-8507-46f90b036562-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.395251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" event={"ID":"3f4759c2-d12a-49a4-b246-824198f9fd2a","Type":"ContainerStarted","Data":"f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d"} Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.396911 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.400669 4766 generic.go:334] "Generic (PLEG): container finished" podID="f179197a-80ec-4ef2-8507-46f90b036562" containerID="3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54" exitCode=0 Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.400723 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.400864 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" event={"ID":"f179197a-80ec-4ef2-8507-46f90b036562","Type":"ContainerDied","Data":"3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54"} Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.400902 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-nmg8j" event={"ID":"f179197a-80ec-4ef2-8507-46f90b036562","Type":"ContainerDied","Data":"3fc8c283adeda0c9955ae9338ccdf3c2bc8bccd77ffbe402efa2285c2fefd333"} Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.400924 4766 scope.go:117] "RemoveContainer" containerID="3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.404876 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a57927d7-7099-4b87-99ee-77aa589cd09f","Type":"ContainerStarted","Data":"a5885e25c1b50574e74bf9082d1b22a74bc1bdd420d533adc98232798e6efdce"} Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.406905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-586lc" event={"ID":"007b7fec-0143-464a-a4bf-9cf5354e5934","Type":"ContainerStarted","Data":"0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf"} Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.417128 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" podStartSLOduration=3.417112643 podStartE2EDuration="3.417112643s" podCreationTimestamp="2025-12-09 03:33:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:26.41404498 +0000 UTC m=+1288.123350406" 
watchObservedRunningTime="2025-12-09 03:33:26.417112643 +0000 UTC m=+1288.126418069" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.439079 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-586lc" podStartSLOduration=2.439064673 podStartE2EDuration="2.439064673s" podCreationTimestamp="2025-12-09 03:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:26.437628554 +0000 UTC m=+1288.146933990" watchObservedRunningTime="2025-12-09 03:33:26.439064673 +0000 UTC m=+1288.148370099" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.453857 4766 scope.go:117] "RemoveContainer" containerID="2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.489922 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.497614 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-nmg8j"] Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.498682 4766 scope.go:117] "RemoveContainer" containerID="3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54" Dec 09 03:33:26 crc kubenswrapper[4766]: E1209 03:33:26.499168 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54\": container with ID starting with 3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54 not found: ID does not exist" containerID="3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.499192 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54"} err="failed to get container status \"3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54\": rpc error: code = NotFound desc = could not find container \"3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54\": container with ID starting with 3a67d669908c1c9b991b01e298de7deb52b3ad4fdccac49663e31f7fc4ef8d54 not found: ID does not exist" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.499225 4766 scope.go:117] "RemoveContainer" containerID="2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f" Dec 09 03:33:26 crc kubenswrapper[4766]: E1209 03:33:26.499919 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f\": container with ID starting with 2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f not found: ID does not exist" containerID="2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.500249 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f"} err="failed to get container status \"2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f\": rpc error: code = NotFound desc = could not find container \"2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f\": container with ID starting with 2c9a5f8ea9930db9f17169488ffffc6a4ddba27b16dd18f0df4bc62a35741d1f not found: ID does not exist" Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.513168 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-nmg8j"] Dec 09 03:33:26 crc kubenswrapper[4766]: I1209 03:33:26.858339 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f179197a-80ec-4ef2-8507-46f90b036562" path="/var/lib/kubelet/pods/f179197a-80ec-4ef2-8507-46f90b036562/volumes" Dec 09 03:33:27 crc kubenswrapper[4766]: I1209 03:33:27.176146 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:27 crc kubenswrapper[4766]: E1209 03:33:27.176351 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 03:33:27 crc kubenswrapper[4766]: E1209 03:33:27.176592 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 03:33:27 crc kubenswrapper[4766]: E1209 03:33:27.176653 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift podName:a13b4958-6576-4cdb-8237-7e8bedeef9fc nodeName:}" failed. No retries permitted until 2025-12-09 03:33:31.176634809 +0000 UTC m=+1292.885940235 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift") pod "swift-storage-0" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc") : configmap "swift-ring-files" not found Dec 09 03:33:27 crc kubenswrapper[4766]: I1209 03:33:27.414808 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"45ff249a-854d-4c30-8216-b7bd9482e08c","Type":"ContainerStarted","Data":"9c1a2b9f960d6adbf0b4615bd89e14438352374dc11fc786a519c3297ce595e5"} Dec 09 03:33:27 crc kubenswrapper[4766]: I1209 03:33:27.416056 4766 generic.go:334] "Generic (PLEG): container finished" podID="26d1d344-fbf5-415d-952e-9ee50493a134" containerID="43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133" exitCode=0 Dec 09 03:33:27 crc kubenswrapper[4766]: I1209 03:33:27.416130 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26d1d344-fbf5-415d-952e-9ee50493a134","Type":"ContainerDied","Data":"43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133"} Dec 09 03:33:27 crc kubenswrapper[4766]: I1209 03:33:27.419801 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:29 crc kubenswrapper[4766]: I1209 03:33:29.438968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hb6ct" event={"ID":"068033ee-d9d8-4cbb-b82a-ced63f563e08","Type":"ContainerStarted","Data":"d49afed99ab414e48e5466fbe45ed0135ebd1a69a605f4158a48d0568bf92d18"} Dec 09 03:33:29 crc kubenswrapper[4766]: I1209 03:33:29.443577 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"45ff249a-854d-4c30-8216-b7bd9482e08c","Type":"ContainerStarted","Data":"bd3d08a30c68e8b92264c99f7fd188c185cb8959a30c11e9fd15099288bfbc41"} Dec 09 03:33:29 crc kubenswrapper[4766]: I1209 03:33:29.443608 4766 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-northd-0" event={"ID":"45ff249a-854d-4c30-8216-b7bd9482e08c","Type":"ContainerStarted","Data":"24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333"} Dec 09 03:33:29 crc kubenswrapper[4766]: I1209 03:33:29.444063 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 09 03:33:29 crc kubenswrapper[4766]: I1209 03:33:29.445957 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26d1d344-fbf5-415d-952e-9ee50493a134","Type":"ContainerStarted","Data":"0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466"} Dec 09 03:33:29 crc kubenswrapper[4766]: I1209 03:33:29.465367 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-hb6ct" podStartSLOduration=2.863455713 podStartE2EDuration="6.465346811s" podCreationTimestamp="2025-12-09 03:33:23 +0000 UTC" firstStartedPulling="2025-12-09 03:33:24.776394598 +0000 UTC m=+1286.485700024" lastFinishedPulling="2025-12-09 03:33:28.378285696 +0000 UTC m=+1290.087591122" observedRunningTime="2025-12-09 03:33:29.46119546 +0000 UTC m=+1291.170500886" watchObservedRunningTime="2025-12-09 03:33:29.465346811 +0000 UTC m=+1291.174652247" Dec 09 03:33:29 crc kubenswrapper[4766]: I1209 03:33:29.494723 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.088743565 podStartE2EDuration="4.494702811s" podCreationTimestamp="2025-12-09 03:33:25 +0000 UTC" firstStartedPulling="2025-12-09 03:33:26.49882131 +0000 UTC m=+1288.208126736" lastFinishedPulling="2025-12-09 03:33:28.904780556 +0000 UTC m=+1290.614085982" observedRunningTime="2025-12-09 03:33:29.479671806 +0000 UTC m=+1291.188977232" watchObservedRunningTime="2025-12-09 03:33:29.494702811 +0000 UTC m=+1291.204008237" Dec 09 03:33:29 crc kubenswrapper[4766]: I1209 03:33:29.506989 4766 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.992910503000001 podStartE2EDuration="52.506970491s" podCreationTimestamp="2025-12-09 03:32:37 +0000 UTC" firstStartedPulling="2025-12-09 03:32:39.82422492 +0000 UTC m=+1241.533530346" lastFinishedPulling="2025-12-09 03:33:22.338284908 +0000 UTC m=+1284.047590334" observedRunningTime="2025-12-09 03:33:29.499410317 +0000 UTC m=+1291.208715763" watchObservedRunningTime="2025-12-09 03:33:29.506970491 +0000 UTC m=+1291.216275917" Dec 09 03:33:30 crc kubenswrapper[4766]: I1209 03:33:30.466231 4766 generic.go:334] "Generic (PLEG): container finished" podID="a57927d7-7099-4b87-99ee-77aa589cd09f" containerID="a5885e25c1b50574e74bf9082d1b22a74bc1bdd420d533adc98232798e6efdce" exitCode=0 Dec 09 03:33:30 crc kubenswrapper[4766]: I1209 03:33:30.467180 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a57927d7-7099-4b87-99ee-77aa589cd09f","Type":"ContainerDied","Data":"a5885e25c1b50574e74bf9082d1b22a74bc1bdd420d533adc98232798e6efdce"} Dec 09 03:33:31 crc kubenswrapper[4766]: I1209 03:33:31.182332 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:31 crc kubenswrapper[4766]: E1209 03:33:31.182537 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 09 03:33:31 crc kubenswrapper[4766]: E1209 03:33:31.182808 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 09 03:33:31 crc kubenswrapper[4766]: E1209 03:33:31.182881 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift podName:a13b4958-6576-4cdb-8237-7e8bedeef9fc nodeName:}" failed. No retries permitted until 2025-12-09 03:33:39.182859391 +0000 UTC m=+1300.892164817 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift") pod "swift-storage-0" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc") : configmap "swift-ring-files" not found Dec 09 03:33:31 crc kubenswrapper[4766]: I1209 03:33:31.478769 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a57927d7-7099-4b87-99ee-77aa589cd09f","Type":"ContainerStarted","Data":"703a57a4d0aff6c0b3a602487fcde7105c12acc5a93f46652d99af88e4758cbd"} Dec 09 03:33:31 crc kubenswrapper[4766]: I1209 03:33:31.520310 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371984.334494 podStartE2EDuration="52.520282926s" podCreationTimestamp="2025-12-09 03:32:39 +0000 UTC" firstStartedPulling="2025-12-09 03:32:41.030184313 +0000 UTC m=+1242.739489739" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:31.503948717 +0000 UTC m=+1293.213254183" watchObservedRunningTime="2025-12-09 03:33:31.520282926 +0000 UTC m=+1293.229588362" Dec 09 03:33:32 crc kubenswrapper[4766]: I1209 03:33:32.487686 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"292bc9d3-3333-4974-93c2-966d76dfa582","Type":"ContainerStarted","Data":"d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2"} Dec 09 03:33:32 crc kubenswrapper[4766]: I1209 03:33:32.488078 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 09 03:33:32 crc kubenswrapper[4766]: I1209 03:33:32.501677 4766 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.265067048 podStartE2EDuration="51.501659149s" podCreationTimestamp="2025-12-09 03:32:41 +0000 UTC" firstStartedPulling="2025-12-09 03:32:42.988187551 +0000 UTC m=+1244.697492977" lastFinishedPulling="2025-12-09 03:33:32.224779652 +0000 UTC m=+1293.934085078" observedRunningTime="2025-12-09 03:33:32.501146935 +0000 UTC m=+1294.210452361" watchObservedRunningTime="2025-12-09 03:33:32.501659149 +0000 UTC m=+1294.210964575" Dec 09 03:33:34 crc kubenswrapper[4766]: I1209 03:33:34.032377 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:34 crc kubenswrapper[4766]: I1209 03:33:34.537156 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-586lc" Dec 09 03:33:34 crc kubenswrapper[4766]: I1209 03:33:34.587148 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-mbcht"] Dec 09 03:33:34 crc kubenswrapper[4766]: I1209 03:33:34.587383 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" podUID="3f4759c2-d12a-49a4-b246-824198f9fd2a" containerName="dnsmasq-dns" containerID="cri-o://f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d" gracePeriod=10 Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.041567 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.048175 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-config\") pod \"3f4759c2-d12a-49a4-b246-824198f9fd2a\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.048265 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6gpb\" (UniqueName: \"kubernetes.io/projected/3f4759c2-d12a-49a4-b246-824198f9fd2a-kube-api-access-w6gpb\") pod \"3f4759c2-d12a-49a4-b246-824198f9fd2a\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.048328 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-dns-svc\") pod \"3f4759c2-d12a-49a4-b246-824198f9fd2a\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.048412 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-ovsdbserver-nb\") pod \"3f4759c2-d12a-49a4-b246-824198f9fd2a\" (UID: \"3f4759c2-d12a-49a4-b246-824198f9fd2a\") " Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.110828 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4759c2-d12a-49a4-b246-824198f9fd2a-kube-api-access-w6gpb" (OuterVolumeSpecName: "kube-api-access-w6gpb") pod "3f4759c2-d12a-49a4-b246-824198f9fd2a" (UID: "3f4759c2-d12a-49a4-b246-824198f9fd2a"). InnerVolumeSpecName "kube-api-access-w6gpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.150030 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6gpb\" (UniqueName: \"kubernetes.io/projected/3f4759c2-d12a-49a4-b246-824198f9fd2a-kube-api-access-w6gpb\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.154062 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f4759c2-d12a-49a4-b246-824198f9fd2a" (UID: "3f4759c2-d12a-49a4-b246-824198f9fd2a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.167872 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-config" (OuterVolumeSpecName: "config") pod "3f4759c2-d12a-49a4-b246-824198f9fd2a" (UID: "3f4759c2-d12a-49a4-b246-824198f9fd2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.168782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f4759c2-d12a-49a4-b246-824198f9fd2a" (UID: "3f4759c2-d12a-49a4-b246-824198f9fd2a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.251619 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.251656 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.251669 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f4759c2-d12a-49a4-b246-824198f9fd2a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.509666 4766 generic.go:334] "Generic (PLEG): container finished" podID="3f4759c2-d12a-49a4-b246-824198f9fd2a" containerID="f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d" exitCode=0 Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.509747 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" event={"ID":"3f4759c2-d12a-49a4-b246-824198f9fd2a","Type":"ContainerDied","Data":"f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d"} Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.509778 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" event={"ID":"3f4759c2-d12a-49a4-b246-824198f9fd2a","Type":"ContainerDied","Data":"1d0eaa3f5f61ea131f26daf8a35814eb63f7a5b7c013e7baebf054ca90e5fce3"} Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.509796 4766 scope.go:117] "RemoveContainer" containerID="f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.509820 4766 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-mbcht" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.511722 4766 generic.go:334] "Generic (PLEG): container finished" podID="068033ee-d9d8-4cbb-b82a-ced63f563e08" containerID="d49afed99ab414e48e5466fbe45ed0135ebd1a69a605f4158a48d0568bf92d18" exitCode=0 Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.511768 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hb6ct" event={"ID":"068033ee-d9d8-4cbb-b82a-ced63f563e08","Type":"ContainerDied","Data":"d49afed99ab414e48e5466fbe45ed0135ebd1a69a605f4158a48d0568bf92d18"} Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.532669 4766 scope.go:117] "RemoveContainer" containerID="27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.558325 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-mbcht"] Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.564642 4766 scope.go:117] "RemoveContainer" containerID="f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d" Dec 09 03:33:35 crc kubenswrapper[4766]: E1209 03:33:35.565311 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d\": container with ID starting with f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d not found: ID does not exist" containerID="f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.565356 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d"} err="failed to get container status \"f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d\": rpc error: code = NotFound 
desc = could not find container \"f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d\": container with ID starting with f06dac9d81f9fec81446df4851e894a7d24c7f4b38bc73d5155f93900c857c2d not found: ID does not exist" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.565460 4766 scope.go:117] "RemoveContainer" containerID="27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6" Dec 09 03:33:35 crc kubenswrapper[4766]: E1209 03:33:35.565914 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6\": container with ID starting with 27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6 not found: ID does not exist" containerID="27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.565951 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6"} err="failed to get container status \"27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6\": rpc error: code = NotFound desc = could not find container \"27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6\": container with ID starting with 27a87dba4cb1ae944685917ae209fe04769b4821a8e7dcf9bcc923eb6ad695a6 not found: ID does not exist" Dec 09 03:33:35 crc kubenswrapper[4766]: I1209 03:33:35.568701 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-mbcht"] Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.820587 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.851554 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4759c2-d12a-49a4-b246-824198f9fd2a" path="/var/lib/kubelet/pods/3f4759c2-d12a-49a4-b246-824198f9fd2a/volumes" Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.976389 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-combined-ca-bundle\") pod \"068033ee-d9d8-4cbb-b82a-ced63f563e08\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.976490 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-ring-data-devices\") pod \"068033ee-d9d8-4cbb-b82a-ced63f563e08\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.976547 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxlt7\" (UniqueName: \"kubernetes.io/projected/068033ee-d9d8-4cbb-b82a-ced63f563e08-kube-api-access-vxlt7\") pod \"068033ee-d9d8-4cbb-b82a-ced63f563e08\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.976623 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/068033ee-d9d8-4cbb-b82a-ced63f563e08-etc-swift\") pod \"068033ee-d9d8-4cbb-b82a-ced63f563e08\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.976653 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-swiftconf\") pod 
\"068033ee-d9d8-4cbb-b82a-ced63f563e08\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.976751 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-dispersionconf\") pod \"068033ee-d9d8-4cbb-b82a-ced63f563e08\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.976783 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-scripts\") pod \"068033ee-d9d8-4cbb-b82a-ced63f563e08\" (UID: \"068033ee-d9d8-4cbb-b82a-ced63f563e08\") " Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.978044 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "068033ee-d9d8-4cbb-b82a-ced63f563e08" (UID: "068033ee-d9d8-4cbb-b82a-ced63f563e08"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.978540 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/068033ee-d9d8-4cbb-b82a-ced63f563e08-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "068033ee-d9d8-4cbb-b82a-ced63f563e08" (UID: "068033ee-d9d8-4cbb-b82a-ced63f563e08"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.983191 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068033ee-d9d8-4cbb-b82a-ced63f563e08-kube-api-access-vxlt7" (OuterVolumeSpecName: "kube-api-access-vxlt7") pod "068033ee-d9d8-4cbb-b82a-ced63f563e08" (UID: "068033ee-d9d8-4cbb-b82a-ced63f563e08"). InnerVolumeSpecName "kube-api-access-vxlt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.984944 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "068033ee-d9d8-4cbb-b82a-ced63f563e08" (UID: "068033ee-d9d8-4cbb-b82a-ced63f563e08"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:33:36 crc kubenswrapper[4766]: I1209 03:33:36.998064 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-scripts" (OuterVolumeSpecName: "scripts") pod "068033ee-d9d8-4cbb-b82a-ced63f563e08" (UID: "068033ee-d9d8-4cbb-b82a-ced63f563e08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.005506 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "068033ee-d9d8-4cbb-b82a-ced63f563e08" (UID: "068033ee-d9d8-4cbb-b82a-ced63f563e08"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.006627 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "068033ee-d9d8-4cbb-b82a-ced63f563e08" (UID: "068033ee-d9d8-4cbb-b82a-ced63f563e08"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.078780 4766 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/068033ee-d9d8-4cbb-b82a-ced63f563e08-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.078818 4766 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.078828 4766 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.078841 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.078851 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/068033ee-d9d8-4cbb-b82a-ced63f563e08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.078863 4766 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/068033ee-d9d8-4cbb-b82a-ced63f563e08-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.078874 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxlt7\" (UniqueName: \"kubernetes.io/projected/068033ee-d9d8-4cbb-b82a-ced63f563e08-kube-api-access-vxlt7\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.316577 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.316916 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.529546 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hb6ct" event={"ID":"068033ee-d9d8-4cbb-b82a-ced63f563e08","Type":"ContainerDied","Data":"9041f5cdc545206fc8c9a4bc640a0bb3988e2a56936d372a9f9096c677422ec3"} Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.529787 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9041f5cdc545206fc8c9a4bc640a0bb3988e2a56936d372a9f9096c677422ec3" Dec 09 03:33:37 crc kubenswrapper[4766]: I1209 03:33:37.529602 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hb6ct" Dec 09 03:33:39 crc kubenswrapper[4766]: I1209 03:33:39.209924 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:39 crc kubenswrapper[4766]: I1209 03:33:39.220445 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"swift-storage-0\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " pod="openstack/swift-storage-0" Dec 09 03:33:39 crc kubenswrapper[4766]: I1209 03:33:39.310275 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 03:33:39 crc kubenswrapper[4766]: I1209 03:33:39.315363 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 09 03:33:39 crc kubenswrapper[4766]: I1209 03:33:39.315456 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 09 03:33:39 crc kubenswrapper[4766]: I1209 03:33:39.406396 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 09 03:33:39 crc kubenswrapper[4766]: I1209 03:33:39.613878 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 09 03:33:39 crc kubenswrapper[4766]: I1209 03:33:39.841694 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295309 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-67ng6"] Dec 09 03:33:40 crc kubenswrapper[4766]: E1209 03:33:40.295713 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068033ee-d9d8-4cbb-b82a-ced63f563e08" containerName="swift-ring-rebalance" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295735 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="068033ee-d9d8-4cbb-b82a-ced63f563e08" containerName="swift-ring-rebalance" Dec 09 03:33:40 crc kubenswrapper[4766]: E1209 03:33:40.295748 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f179197a-80ec-4ef2-8507-46f90b036562" containerName="dnsmasq-dns" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295754 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f179197a-80ec-4ef2-8507-46f90b036562" containerName="dnsmasq-dns" Dec 09 03:33:40 crc kubenswrapper[4766]: E1209 03:33:40.295775 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4759c2-d12a-49a4-b246-824198f9fd2a" containerName="init" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295781 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4759c2-d12a-49a4-b246-824198f9fd2a" containerName="init" Dec 09 03:33:40 crc kubenswrapper[4766]: E1209 03:33:40.295796 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f179197a-80ec-4ef2-8507-46f90b036562" containerName="init" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295803 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f179197a-80ec-4ef2-8507-46f90b036562" containerName="init" Dec 09 03:33:40 crc kubenswrapper[4766]: E1209 03:33:40.295814 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4759c2-d12a-49a4-b246-824198f9fd2a" containerName="dnsmasq-dns" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295820 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4759c2-d12a-49a4-b246-824198f9fd2a" containerName="dnsmasq-dns" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295970 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="068033ee-d9d8-4cbb-b82a-ced63f563e08" containerName="swift-ring-rebalance" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295986 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4759c2-d12a-49a4-b246-824198f9fd2a" containerName="dnsmasq-dns" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.295997 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f179197a-80ec-4ef2-8507-46f90b036562" containerName="dnsmasq-dns" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.296535 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.302890 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-49e7-account-create-update-ctlzz"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.304297 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.309367 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-67ng6"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.313809 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.315967 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-49e7-account-create-update-ctlzz"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.429515 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r622g\" (UniqueName: \"kubernetes.io/projected/41a4873b-e4c4-4284-97a6-baa8346e4849-kube-api-access-r622g\") pod \"keystone-49e7-account-create-update-ctlzz\" (UID: \"41a4873b-e4c4-4284-97a6-baa8346e4849\") " pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:40 crc 
kubenswrapper[4766]: I1209 03:33:40.429567 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-operator-scripts\") pod \"keystone-db-create-67ng6\" (UID: \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\") " pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.429776 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41a4873b-e4c4-4284-97a6-baa8346e4849-operator-scripts\") pod \"keystone-49e7-account-create-update-ctlzz\" (UID: \"41a4873b-e4c4-4284-97a6-baa8346e4849\") " pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.430145 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjcmx\" (UniqueName: \"kubernetes.io/projected/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-kube-api-access-sjcmx\") pod \"keystone-db-create-67ng6\" (UID: \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\") " pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.447375 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.447428 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.531784 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjcmx\" (UniqueName: \"kubernetes.io/projected/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-kube-api-access-sjcmx\") pod \"keystone-db-create-67ng6\" (UID: \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\") " pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:40 
crc kubenswrapper[4766]: I1209 03:33:40.531907 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r622g\" (UniqueName: \"kubernetes.io/projected/41a4873b-e4c4-4284-97a6-baa8346e4849-kube-api-access-r622g\") pod \"keystone-49e7-account-create-update-ctlzz\" (UID: \"41a4873b-e4c4-4284-97a6-baa8346e4849\") " pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.531938 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-operator-scripts\") pod \"keystone-db-create-67ng6\" (UID: \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\") " pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.531990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41a4873b-e4c4-4284-97a6-baa8346e4849-operator-scripts\") pod \"keystone-49e7-account-create-update-ctlzz\" (UID: \"41a4873b-e4c4-4284-97a6-baa8346e4849\") " pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.532918 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41a4873b-e4c4-4284-97a6-baa8346e4849-operator-scripts\") pod \"keystone-49e7-account-create-update-ctlzz\" (UID: \"41a4873b-e4c4-4284-97a6-baa8346e4849\") " pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.532920 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-operator-scripts\") pod \"keystone-db-create-67ng6\" (UID: \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\") " 
pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.552322 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"688fcfc4446b57ee29f3e81848e1d6362676296f66726b747133558fbc65c0c7"} Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.563132 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjcmx\" (UniqueName: \"kubernetes.io/projected/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-kube-api-access-sjcmx\") pod \"keystone-db-create-67ng6\" (UID: \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\") " pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.575852 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r622g\" (UniqueName: \"kubernetes.io/projected/41a4873b-e4c4-4284-97a6-baa8346e4849-kube-api-access-r622g\") pod \"keystone-49e7-account-create-update-ctlzz\" (UID: \"41a4873b-e4c4-4284-97a6-baa8346e4849\") " pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.577123 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-cqs82"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.578393 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cqs82" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.600224 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cqs82"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.633575 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.654681 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2528\" (UniqueName: \"kubernetes.io/projected/d66f5276-1f41-4add-b998-70930f40b6a7-kube-api-access-v2528\") pod \"placement-db-create-cqs82\" (UID: \"d66f5276-1f41-4add-b998-70930f40b6a7\") " pod="openstack/placement-db-create-cqs82" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.654798 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d66f5276-1f41-4add-b998-70930f40b6a7-operator-scripts\") pod \"placement-db-create-cqs82\" (UID: \"d66f5276-1f41-4add-b998-70930f40b6a7\") " pod="openstack/placement-db-create-cqs82" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.655164 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.733268 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.751291 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0abc-account-create-update-dxlcc"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.752658 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.758064 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.772733 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2528\" (UniqueName: \"kubernetes.io/projected/d66f5276-1f41-4add-b998-70930f40b6a7-kube-api-access-v2528\") pod \"placement-db-create-cqs82\" (UID: \"d66f5276-1f41-4add-b998-70930f40b6a7\") " pod="openstack/placement-db-create-cqs82" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.772801 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d66f5276-1f41-4add-b998-70930f40b6a7-operator-scripts\") pod \"placement-db-create-cqs82\" (UID: \"d66f5276-1f41-4add-b998-70930f40b6a7\") " pod="openstack/placement-db-create-cqs82" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.776239 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0abc-account-create-update-dxlcc"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.777960 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d66f5276-1f41-4add-b998-70930f40b6a7-operator-scripts\") pod \"placement-db-create-cqs82\" (UID: \"d66f5276-1f41-4add-b998-70930f40b6a7\") " pod="openstack/placement-db-create-cqs82" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.814391 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2528\" (UniqueName: \"kubernetes.io/projected/d66f5276-1f41-4add-b998-70930f40b6a7-kube-api-access-v2528\") pod \"placement-db-create-cqs82\" (UID: \"d66f5276-1f41-4add-b998-70930f40b6a7\") " pod="openstack/placement-db-create-cqs82" Dec 09 03:33:40 crc 
kubenswrapper[4766]: I1209 03:33:40.855896 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2ljqg"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.856859 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2ljqg"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.856937 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.879453 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgnn\" (UniqueName: \"kubernetes.io/projected/6e65ef84-392e-49dd-8f6d-ed4562e8524a-kube-api-access-9qgnn\") pod \"placement-0abc-account-create-update-dxlcc\" (UID: \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\") " pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.879696 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e65ef84-392e-49dd-8f6d-ed4562e8524a-operator-scripts\") pod \"placement-0abc-account-create-update-dxlcc\" (UID: \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\") " pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.888511 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.907102 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9417-account-create-update-59kg8"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.909353 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.920514 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9417-account-create-update-59kg8"] Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.922968 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.982061 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m5nc\" (UniqueName: \"kubernetes.io/projected/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-kube-api-access-2m5nc\") pod \"glance-db-create-2ljqg\" (UID: \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\") " pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.982115 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e65ef84-392e-49dd-8f6d-ed4562e8524a-operator-scripts\") pod \"placement-0abc-account-create-update-dxlcc\" (UID: \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\") " pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.982314 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-operator-scripts\") pod \"glance-db-create-2ljqg\" (UID: \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\") " pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.982359 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgnn\" (UniqueName: \"kubernetes.io/projected/6e65ef84-392e-49dd-8f6d-ed4562e8524a-kube-api-access-9qgnn\") pod \"placement-0abc-account-create-update-dxlcc\" (UID: 
\"6e65ef84-392e-49dd-8f6d-ed4562e8524a\") " pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.983102 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e65ef84-392e-49dd-8f6d-ed4562e8524a-operator-scripts\") pod \"placement-0abc-account-create-update-dxlcc\" (UID: \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\") " pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:40 crc kubenswrapper[4766]: I1209 03:33:40.990741 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cqs82" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.001151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgnn\" (UniqueName: \"kubernetes.io/projected/6e65ef84-392e-49dd-8f6d-ed4562e8524a-kube-api-access-9qgnn\") pod \"placement-0abc-account-create-update-dxlcc\" (UID: \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\") " pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.083897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-operator-scripts\") pod \"glance-db-create-2ljqg\" (UID: \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\") " pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.083964 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85c956d0-404a-4be0-a2ba-27706be74ec0-operator-scripts\") pod \"glance-9417-account-create-update-59kg8\" (UID: \"85c956d0-404a-4be0-a2ba-27706be74ec0\") " pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.084010 
4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m5nc\" (UniqueName: \"kubernetes.io/projected/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-kube-api-access-2m5nc\") pod \"glance-db-create-2ljqg\" (UID: \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\") " pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.084050 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2w65\" (UniqueName: \"kubernetes.io/projected/85c956d0-404a-4be0-a2ba-27706be74ec0-kube-api-access-j2w65\") pod \"glance-9417-account-create-update-59kg8\" (UID: \"85c956d0-404a-4be0-a2ba-27706be74ec0\") " pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.084978 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-operator-scripts\") pod \"glance-db-create-2ljqg\" (UID: \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\") " pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.102886 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m5nc\" (UniqueName: \"kubernetes.io/projected/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-kube-api-access-2m5nc\") pod \"glance-db-create-2ljqg\" (UID: \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\") " pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.118139 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.118504 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.180429 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.184844 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85c956d0-404a-4be0-a2ba-27706be74ec0-operator-scripts\") pod \"glance-9417-account-create-update-59kg8\" (UID: \"85c956d0-404a-4be0-a2ba-27706be74ec0\") " pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.184919 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2w65\" (UniqueName: \"kubernetes.io/projected/85c956d0-404a-4be0-a2ba-27706be74ec0-kube-api-access-j2w65\") pod \"glance-9417-account-create-update-59kg8\" (UID: \"85c956d0-404a-4be0-a2ba-27706be74ec0\") " pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.186023 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85c956d0-404a-4be0-a2ba-27706be74ec0-operator-scripts\") pod \"glance-9417-account-create-update-59kg8\" (UID: \"85c956d0-404a-4be0-a2ba-27706be74ec0\") " pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.204638 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2w65\" (UniqueName: \"kubernetes.io/projected/85c956d0-404a-4be0-a2ba-27706be74ec0-kube-api-access-j2w65\") pod \"glance-9417-account-create-update-59kg8\" (UID: \"85c956d0-404a-4be0-a2ba-27706be74ec0\") " pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.253127 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.294836 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cqs82"] Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.364817 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-49e7-account-create-update-ctlzz"] Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.376756 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-67ng6"] Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.586604 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cqs82" event={"ID":"d66f5276-1f41-4add-b998-70930f40b6a7","Type":"ContainerStarted","Data":"170a8aa89bb03063fe3f32fe8c4a3c0ff911a6b2d669976e2b107060a6a690cb"} Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.588472 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-67ng6" event={"ID":"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa","Type":"ContainerStarted","Data":"2e5708243197fb1fe18646e47f2d07b4e1d56794dac5e13c6ee29d5d67f07360"} Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.589933 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-49e7-account-create-update-ctlzz" event={"ID":"41a4873b-e4c4-4284-97a6-baa8346e4849","Type":"ContainerStarted","Data":"a0015a46ce0325dd355eaaeb28f99db924598f5440bc2bae63d45b9484a2d412"} Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.633637 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0abc-account-create-update-dxlcc"] Dec 09 03:33:41 crc kubenswrapper[4766]: W1209 03:33:41.639395 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e65ef84_392e_49dd_8f6d_ed4562e8524a.slice/crio-6b11d4a1aab91f3d6852a5860f4119051783591d0a11d31526f6e8d73834917f WatchSource:0}: Error finding container 6b11d4a1aab91f3d6852a5860f4119051783591d0a11d31526f6e8d73834917f: Status 404 returned error can't find the container with id 6b11d4a1aab91f3d6852a5860f4119051783591d0a11d31526f6e8d73834917f Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.738494 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2ljqg"] Dec 09 03:33:41 crc kubenswrapper[4766]: I1209 03:33:41.790481 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9417-account-create-update-59kg8"] Dec 09 03:33:41 crc kubenswrapper[4766]: W1209 03:33:41.802428 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85c956d0_404a_4be0_a2ba_27706be74ec0.slice/crio-cc1a91f42344279bc68a7f51e4d5719577b34689a6c27ef7bacb84424646701c WatchSource:0}: Error finding container cc1a91f42344279bc68a7f51e4d5719577b34689a6c27ef7bacb84424646701c: Status 404 returned error can't find the container with id cc1a91f42344279bc68a7f51e4d5719577b34689a6c27ef7bacb84424646701c Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.295489 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.600715 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2ljqg" event={"ID":"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032","Type":"ContainerStarted","Data":"24729679fd0dcc450517c6906dbe60f8c99be4396dcda6ec1d41ff11e94c6452"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.600767 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2ljqg" 
event={"ID":"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032","Type":"ContainerStarted","Data":"ef6d35adcb65405d75e0cd0cea049c9109221606957ad2352d6ec1131167d97a"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.602891 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-49e7-account-create-update-ctlzz" event={"ID":"41a4873b-e4c4-4284-97a6-baa8346e4849","Type":"ContainerStarted","Data":"2d3338e237ba7af89e198aa31cd1ddc3d0b533524dcf30eb8fa1547e1b318aaf"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.605895 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0abc-account-create-update-dxlcc" event={"ID":"6e65ef84-392e-49dd-8f6d-ed4562e8524a","Type":"ContainerStarted","Data":"4f3f6e014efa4d6dfba0dea7aea4893fdb2153914420f7a8f8e5fd012360ef1f"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.605959 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0abc-account-create-update-dxlcc" event={"ID":"6e65ef84-392e-49dd-8f6d-ed4562e8524a","Type":"ContainerStarted","Data":"6b11d4a1aab91f3d6852a5860f4119051783591d0a11d31526f6e8d73834917f"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.611973 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cqs82" event={"ID":"d66f5276-1f41-4add-b998-70930f40b6a7","Type":"ContainerStarted","Data":"7e3d89ca9056493b6e6e8ac9b33bcad91da4a6aaccec7f99bacb242855bb6e95"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.624752 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9417-account-create-update-59kg8" event={"ID":"85c956d0-404a-4be0-a2ba-27706be74ec0","Type":"ContainerStarted","Data":"0b4f54ea8d642e05e2a555795cff1dc74694cf421659e3f43efd44714ff34125"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.625582 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9417-account-create-update-59kg8" 
event={"ID":"85c956d0-404a-4be0-a2ba-27706be74ec0","Type":"ContainerStarted","Data":"cc1a91f42344279bc68a7f51e4d5719577b34689a6c27ef7bacb84424646701c"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.624928 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-2ljqg" podStartSLOduration=2.624905931 podStartE2EDuration="2.624905931s" podCreationTimestamp="2025-12-09 03:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:42.616447103 +0000 UTC m=+1304.325752529" watchObservedRunningTime="2025-12-09 03:33:42.624905931 +0000 UTC m=+1304.334211357" Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.627776 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-67ng6" event={"ID":"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa","Type":"ContainerStarted","Data":"df2580fc6a7c68c79a9fa23dde958abfe360c1d442a6396b5b94f67428f72619"} Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.668703 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-49e7-account-create-update-ctlzz" podStartSLOduration=2.6686763669999998 podStartE2EDuration="2.668676367s" podCreationTimestamp="2025-12-09 03:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:42.641823576 +0000 UTC m=+1304.351129012" watchObservedRunningTime="2025-12-09 03:33:42.668676367 +0000 UTC m=+1304.377981793" Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.670756 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0abc-account-create-update-dxlcc" podStartSLOduration=2.6707484729999997 podStartE2EDuration="2.670748473s" podCreationTimestamp="2025-12-09 03:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:42.662475431 +0000 UTC m=+1304.371780867" watchObservedRunningTime="2025-12-09 03:33:42.670748473 +0000 UTC m=+1304.380053899" Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.687661 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-cqs82" podStartSLOduration=2.687642968 podStartE2EDuration="2.687642968s" podCreationTimestamp="2025-12-09 03:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:42.68511969 +0000 UTC m=+1304.394425136" watchObservedRunningTime="2025-12-09 03:33:42.687642968 +0000 UTC m=+1304.396948394" Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.706718 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9417-account-create-update-59kg8" podStartSLOduration=2.7066989599999998 podStartE2EDuration="2.70669896s" podCreationTimestamp="2025-12-09 03:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:42.70186166 +0000 UTC m=+1304.411167086" watchObservedRunningTime="2025-12-09 03:33:42.70669896 +0000 UTC m=+1304.416004386" Dec 09 03:33:42 crc kubenswrapper[4766]: I1209 03:33:42.724754 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-67ng6" podStartSLOduration=2.724728955 podStartE2EDuration="2.724728955s" podCreationTimestamp="2025-12-09 03:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:42.715000804 +0000 UTC m=+1304.424306230" watchObservedRunningTime="2025-12-09 03:33:42.724728955 +0000 UTC m=+1304.434034381" Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.653522 
4766 generic.go:334] "Generic (PLEG): container finished" podID="85c956d0-404a-4be0-a2ba-27706be74ec0" containerID="0b4f54ea8d642e05e2a555795cff1dc74694cf421659e3f43efd44714ff34125" exitCode=0 Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.653581 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9417-account-create-update-59kg8" event={"ID":"85c956d0-404a-4be0-a2ba-27706be74ec0","Type":"ContainerDied","Data":"0b4f54ea8d642e05e2a555795cff1dc74694cf421659e3f43efd44714ff34125"} Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.655537 4766 generic.go:334] "Generic (PLEG): container finished" podID="8ef5df73-cf11-4fae-bd6f-e91cc2f767aa" containerID="df2580fc6a7c68c79a9fa23dde958abfe360c1d442a6396b5b94f67428f72619" exitCode=0 Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.655592 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-67ng6" event={"ID":"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa","Type":"ContainerDied","Data":"df2580fc6a7c68c79a9fa23dde958abfe360c1d442a6396b5b94f67428f72619"} Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.656876 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b1cdb70-62c7-4d17-8c11-4f9f40fe8032" containerID="24729679fd0dcc450517c6906dbe60f8c99be4396dcda6ec1d41ff11e94c6452" exitCode=0 Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.657051 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2ljqg" event={"ID":"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032","Type":"ContainerDied","Data":"24729679fd0dcc450517c6906dbe60f8c99be4396dcda6ec1d41ff11e94c6452"} Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.658655 4766 generic.go:334] "Generic (PLEG): container finished" podID="41a4873b-e4c4-4284-97a6-baa8346e4849" containerID="2d3338e237ba7af89e198aa31cd1ddc3d0b533524dcf30eb8fa1547e1b318aaf" exitCode=0 Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.658705 4766 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-49e7-account-create-update-ctlzz" event={"ID":"41a4873b-e4c4-4284-97a6-baa8346e4849","Type":"ContainerDied","Data":"2d3338e237ba7af89e198aa31cd1ddc3d0b533524dcf30eb8fa1547e1b318aaf"} Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.663933 4766 generic.go:334] "Generic (PLEG): container finished" podID="6e65ef84-392e-49dd-8f6d-ed4562e8524a" containerID="4f3f6e014efa4d6dfba0dea7aea4893fdb2153914420f7a8f8e5fd012360ef1f" exitCode=0 Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.664024 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0abc-account-create-update-dxlcc" event={"ID":"6e65ef84-392e-49dd-8f6d-ed4562e8524a","Type":"ContainerDied","Data":"4f3f6e014efa4d6dfba0dea7aea4893fdb2153914420f7a8f8e5fd012360ef1f"} Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.666283 4766 generic.go:334] "Generic (PLEG): container finished" podID="d66f5276-1f41-4add-b998-70930f40b6a7" containerID="7e3d89ca9056493b6e6e8ac9b33bcad91da4a6aaccec7f99bacb242855bb6e95" exitCode=0 Dec 09 03:33:43 crc kubenswrapper[4766]: I1209 03:33:43.666349 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cqs82" event={"ID":"d66f5276-1f41-4add-b998-70930f40b6a7","Type":"ContainerDied","Data":"7e3d89ca9056493b6e6e8ac9b33bcad91da4a6aaccec7f99bacb242855bb6e95"} Dec 09 03:33:44 crc kubenswrapper[4766]: I1209 03:33:44.676817 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f"} Dec 09 03:33:44 crc kubenswrapper[4766]: I1209 03:33:44.677190 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7"} Dec 09 03:33:44 
crc kubenswrapper[4766]: I1209 03:33:44.677207 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14"} Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.090150 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.164653 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-operator-scripts\") pod \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\" (UID: \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.164801 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjcmx\" (UniqueName: \"kubernetes.io/projected/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-kube-api-access-sjcmx\") pod \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\" (UID: \"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.165930 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ef5df73-cf11-4fae-bd6f-e91cc2f767aa" (UID: "8ef5df73-cf11-4fae-bd6f-e91cc2f767aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.171003 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-kube-api-access-sjcmx" (OuterVolumeSpecName: "kube-api-access-sjcmx") pod "8ef5df73-cf11-4fae-bd6f-e91cc2f767aa" (UID: "8ef5df73-cf11-4fae-bd6f-e91cc2f767aa"). InnerVolumeSpecName "kube-api-access-sjcmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.267182 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjcmx\" (UniqueName: \"kubernetes.io/projected/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-kube-api-access-sjcmx\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.267237 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.284456 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.291008 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.297618 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.311770 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.324087 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cqs82" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.370064 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41a4873b-e4c4-4284-97a6-baa8346e4849-operator-scripts\") pod \"41a4873b-e4c4-4284-97a6-baa8346e4849\" (UID: \"41a4873b-e4c4-4284-97a6-baa8346e4849\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.370150 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r622g\" (UniqueName: \"kubernetes.io/projected/41a4873b-e4c4-4284-97a6-baa8346e4849-kube-api-access-r622g\") pod \"41a4873b-e4c4-4284-97a6-baa8346e4849\" (UID: \"41a4873b-e4c4-4284-97a6-baa8346e4849\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.370247 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2w65\" (UniqueName: \"kubernetes.io/projected/85c956d0-404a-4be0-a2ba-27706be74ec0-kube-api-access-j2w65\") pod \"85c956d0-404a-4be0-a2ba-27706be74ec0\" (UID: \"85c956d0-404a-4be0-a2ba-27706be74ec0\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.370280 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-operator-scripts\") pod \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\" (UID: \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.370396 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85c956d0-404a-4be0-a2ba-27706be74ec0-operator-scripts\") pod \"85c956d0-404a-4be0-a2ba-27706be74ec0\" (UID: \"85c956d0-404a-4be0-a2ba-27706be74ec0\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.370433 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2m5nc\" (UniqueName: \"kubernetes.io/projected/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-kube-api-access-2m5nc\") pod \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\" (UID: \"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.384710 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b1cdb70-62c7-4d17-8c11-4f9f40fe8032" (UID: "6b1cdb70-62c7-4d17-8c11-4f9f40fe8032"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.385136 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a4873b-e4c4-4284-97a6-baa8346e4849-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41a4873b-e4c4-4284-97a6-baa8346e4849" (UID: "41a4873b-e4c4-4284-97a6-baa8346e4849"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.386641 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85c956d0-404a-4be0-a2ba-27706be74ec0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85c956d0-404a-4be0-a2ba-27706be74ec0" (UID: "85c956d0-404a-4be0-a2ba-27706be74ec0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.390173 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a4873b-e4c4-4284-97a6-baa8346e4849-kube-api-access-r622g" (OuterVolumeSpecName: "kube-api-access-r622g") pod "41a4873b-e4c4-4284-97a6-baa8346e4849" (UID: "41a4873b-e4c4-4284-97a6-baa8346e4849"). 
InnerVolumeSpecName "kube-api-access-r622g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.399404 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-kube-api-access-2m5nc" (OuterVolumeSpecName: "kube-api-access-2m5nc") pod "6b1cdb70-62c7-4d17-8c11-4f9f40fe8032" (UID: "6b1cdb70-62c7-4d17-8c11-4f9f40fe8032"). InnerVolumeSpecName "kube-api-access-2m5nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.400332 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c956d0-404a-4be0-a2ba-27706be74ec0-kube-api-access-j2w65" (OuterVolumeSpecName: "kube-api-access-j2w65") pod "85c956d0-404a-4be0-a2ba-27706be74ec0" (UID: "85c956d0-404a-4be0-a2ba-27706be74ec0"). InnerVolumeSpecName "kube-api-access-j2w65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.472477 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e65ef84-392e-49dd-8f6d-ed4562e8524a-operator-scripts\") pod \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\" (UID: \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.472693 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qgnn\" (UniqueName: \"kubernetes.io/projected/6e65ef84-392e-49dd-8f6d-ed4562e8524a-kube-api-access-9qgnn\") pod \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\" (UID: \"6e65ef84-392e-49dd-8f6d-ed4562e8524a\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.472736 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2528\" (UniqueName: 
\"kubernetes.io/projected/d66f5276-1f41-4add-b998-70930f40b6a7-kube-api-access-v2528\") pod \"d66f5276-1f41-4add-b998-70930f40b6a7\" (UID: \"d66f5276-1f41-4add-b998-70930f40b6a7\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.472780 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d66f5276-1f41-4add-b998-70930f40b6a7-operator-scripts\") pod \"d66f5276-1f41-4add-b998-70930f40b6a7\" (UID: \"d66f5276-1f41-4add-b998-70930f40b6a7\") " Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.473077 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e65ef84-392e-49dd-8f6d-ed4562e8524a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e65ef84-392e-49dd-8f6d-ed4562e8524a" (UID: "6e65ef84-392e-49dd-8f6d-ed4562e8524a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.473256 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85c956d0-404a-4be0-a2ba-27706be74ec0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.473286 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m5nc\" (UniqueName: \"kubernetes.io/projected/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-kube-api-access-2m5nc\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.473301 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41a4873b-e4c4-4284-97a6-baa8346e4849-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.473314 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r622g\" (UniqueName: 
\"kubernetes.io/projected/41a4873b-e4c4-4284-97a6-baa8346e4849-kube-api-access-r622g\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.473327 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2w65\" (UniqueName: \"kubernetes.io/projected/85c956d0-404a-4be0-a2ba-27706be74ec0-kube-api-access-j2w65\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.473339 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.473785 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d66f5276-1f41-4add-b998-70930f40b6a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d66f5276-1f41-4add-b998-70930f40b6a7" (UID: "d66f5276-1f41-4add-b998-70930f40b6a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.475583 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e65ef84-392e-49dd-8f6d-ed4562e8524a-kube-api-access-9qgnn" (OuterVolumeSpecName: "kube-api-access-9qgnn") pod "6e65ef84-392e-49dd-8f6d-ed4562e8524a" (UID: "6e65ef84-392e-49dd-8f6d-ed4562e8524a"). InnerVolumeSpecName "kube-api-access-9qgnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.475767 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66f5276-1f41-4add-b998-70930f40b6a7-kube-api-access-v2528" (OuterVolumeSpecName: "kube-api-access-v2528") pod "d66f5276-1f41-4add-b998-70930f40b6a7" (UID: "d66f5276-1f41-4add-b998-70930f40b6a7"). InnerVolumeSpecName "kube-api-access-v2528". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.574533 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2528\" (UniqueName: \"kubernetes.io/projected/d66f5276-1f41-4add-b998-70930f40b6a7-kube-api-access-v2528\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.574582 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d66f5276-1f41-4add-b998-70930f40b6a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.574596 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e65ef84-392e-49dd-8f6d-ed4562e8524a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.574607 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qgnn\" (UniqueName: \"kubernetes.io/projected/6e65ef84-392e-49dd-8f6d-ed4562e8524a-kube-api-access-9qgnn\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.688202 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9417-account-create-update-59kg8" event={"ID":"85c956d0-404a-4be0-a2ba-27706be74ec0","Type":"ContainerDied","Data":"cc1a91f42344279bc68a7f51e4d5719577b34689a6c27ef7bacb84424646701c"} Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.688315 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc1a91f42344279bc68a7f51e4d5719577b34689a6c27ef7bacb84424646701c" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.688277 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9417-account-create-update-59kg8" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.689548 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-67ng6" event={"ID":"8ef5df73-cf11-4fae-bd6f-e91cc2f767aa","Type":"ContainerDied","Data":"2e5708243197fb1fe18646e47f2d07b4e1d56794dac5e13c6ee29d5d67f07360"} Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.689575 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e5708243197fb1fe18646e47f2d07b4e1d56794dac5e13c6ee29d5d67f07360" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.689665 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-67ng6" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.691544 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2ljqg" event={"ID":"6b1cdb70-62c7-4d17-8c11-4f9f40fe8032","Type":"ContainerDied","Data":"ef6d35adcb65405d75e0cd0cea049c9109221606957ad2352d6ec1131167d97a"} Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.691581 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef6d35adcb65405d75e0cd0cea049c9109221606957ad2352d6ec1131167d97a" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.691824 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2ljqg" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.692854 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-49e7-account-create-update-ctlzz" event={"ID":"41a4873b-e4c4-4284-97a6-baa8346e4849","Type":"ContainerDied","Data":"a0015a46ce0325dd355eaaeb28f99db924598f5440bc2bae63d45b9484a2d412"} Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.692896 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0015a46ce0325dd355eaaeb28f99db924598f5440bc2bae63d45b9484a2d412" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.692977 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-49e7-account-create-update-ctlzz" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.695762 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0abc-account-create-update-dxlcc" event={"ID":"6e65ef84-392e-49dd-8f6d-ed4562e8524a","Type":"ContainerDied","Data":"6b11d4a1aab91f3d6852a5860f4119051783591d0a11d31526f6e8d73834917f"} Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.695790 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b11d4a1aab91f3d6852a5860f4119051783591d0a11d31526f6e8d73834917f" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.695854 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0abc-account-create-update-dxlcc" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.699471 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cqs82" event={"ID":"d66f5276-1f41-4add-b998-70930f40b6a7","Type":"ContainerDied","Data":"170a8aa89bb03063fe3f32fe8c4a3c0ff911a6b2d669976e2b107060a6a690cb"} Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.699879 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="170a8aa89bb03063fe3f32fe8c4a3c0ff911a6b2d669976e2b107060a6a690cb" Dec 09 03:33:45 crc kubenswrapper[4766]: I1209 03:33:45.699931 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cqs82" Dec 09 03:33:50 crc kubenswrapper[4766]: I1209 03:33:50.751319 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225"} Dec 09 03:33:50 crc kubenswrapper[4766]: I1209 03:33:50.752696 4766 generic.go:334] "Generic (PLEG): container finished" podID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerID="953d3d0d70a4b3d2cd1bec6f32e0bdb69cc885c56c1891b2288ff895a80a5c71" exitCode=0 Dec 09 03:33:50 crc kubenswrapper[4766]: I1209 03:33:50.752764 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48862672-08e2-4ac6-86a3-57d84bbc868d","Type":"ContainerDied","Data":"953d3d0d70a4b3d2cd1bec6f32e0bdb69cc885c56c1891b2288ff895a80a5c71"} Dec 09 03:33:50 crc kubenswrapper[4766]: I1209 03:33:50.754811 4766 generic.go:334] "Generic (PLEG): container finished" podID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerID="89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238" exitCode=0 Dec 09 03:33:50 crc kubenswrapper[4766]: I1209 03:33:50.754885 4766 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3af438c1-d0b9-4ecb-bb88-a0efd14736a4","Type":"ContainerDied","Data":"89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238"} Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.141020 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-bkt59"] Dec 09 03:33:51 crc kubenswrapper[4766]: E1209 03:33:51.141708 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a4873b-e4c4-4284-97a6-baa8346e4849" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.141728 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a4873b-e4c4-4284-97a6-baa8346e4849" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: E1209 03:33:51.141749 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66f5276-1f41-4add-b998-70930f40b6a7" containerName="mariadb-database-create" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.141757 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66f5276-1f41-4add-b998-70930f40b6a7" containerName="mariadb-database-create" Dec 09 03:33:51 crc kubenswrapper[4766]: E1209 03:33:51.141779 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e65ef84-392e-49dd-8f6d-ed4562e8524a" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.141789 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e65ef84-392e-49dd-8f6d-ed4562e8524a" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: E1209 03:33:51.141810 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c956d0-404a-4be0-a2ba-27706be74ec0" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.141817 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="85c956d0-404a-4be0-a2ba-27706be74ec0" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: E1209 03:33:51.141830 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef5df73-cf11-4fae-bd6f-e91cc2f767aa" containerName="mariadb-database-create" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.141837 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef5df73-cf11-4fae-bd6f-e91cc2f767aa" containerName="mariadb-database-create" Dec 09 03:33:51 crc kubenswrapper[4766]: E1209 03:33:51.141858 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1cdb70-62c7-4d17-8c11-4f9f40fe8032" containerName="mariadb-database-create" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.141866 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1cdb70-62c7-4d17-8c11-4f9f40fe8032" containerName="mariadb-database-create" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.142075 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef5df73-cf11-4fae-bd6f-e91cc2f767aa" containerName="mariadb-database-create" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.142114 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a4873b-e4c4-4284-97a6-baa8346e4849" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.142140 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c956d0-404a-4be0-a2ba-27706be74ec0" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.142156 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e65ef84-392e-49dd-8f6d-ed4562e8524a" containerName="mariadb-account-create-update" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.142175 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1cdb70-62c7-4d17-8c11-4f9f40fe8032" containerName="mariadb-database-create" Dec 09 
03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.142192 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66f5276-1f41-4add-b998-70930f40b6a7" containerName="mariadb-database-create" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.142843 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.145925 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.146196 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hqfv8" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.154006 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bkt59"] Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.282928 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-config-data\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.282982 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-db-sync-config-data\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.283071 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjhcl\" (UniqueName: \"kubernetes.io/projected/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-kube-api-access-tjhcl\") pod \"glance-db-sync-bkt59\" (UID: 
\"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.283121 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-combined-ca-bundle\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.390525 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-combined-ca-bundle\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.396007 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-combined-ca-bundle\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.398389 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-config-data\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.398484 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-db-sync-config-data\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc 
kubenswrapper[4766]: I1209 03:33:51.398668 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjhcl\" (UniqueName: \"kubernetes.io/projected/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-kube-api-access-tjhcl\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.405493 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-db-sync-config-data\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.407667 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-config-data\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.427202 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjhcl\" (UniqueName: \"kubernetes.io/projected/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-kube-api-access-tjhcl\") pod \"glance-db-sync-bkt59\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.463984 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-bkt59" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.701870 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s25fz" podUID="f28c984f-04eb-4398-af98-9e2c5e6afd13" containerName="ovn-controller" probeResult="failure" output=< Dec 09 03:33:51 crc kubenswrapper[4766]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 09 03:33:51 crc kubenswrapper[4766]: > Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.713003 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.734601 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.784622 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48862672-08e2-4ac6-86a3-57d84bbc868d","Type":"ContainerStarted","Data":"99fdba038c7e2b9ae61670d79759f6a9042c470a8ddf322cdd4ad3f61ffc2528"} Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.785712 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.794835 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3af438c1-d0b9-4ecb-bb88-a0efd14736a4","Type":"ContainerStarted","Data":"962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18"} Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.795267 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.813033 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=41.596966036 podStartE2EDuration="1m15.81301768s" podCreationTimestamp="2025-12-09 03:32:36 +0000 UTC" firstStartedPulling="2025-12-09 03:32:39.591906652 +0000 UTC m=+1241.301212078" lastFinishedPulling="2025-12-09 03:33:13.807958286 +0000 UTC m=+1275.517263722" observedRunningTime="2025-12-09 03:33:51.806975188 +0000 UTC m=+1313.516280634" watchObservedRunningTime="2025-12-09 03:33:51.81301768 +0000 UTC m=+1313.522323107" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.848961 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371960.00583 podStartE2EDuration="1m16.848944787s" podCreationTimestamp="2025-12-09 03:32:35 +0000 UTC" firstStartedPulling="2025-12-09 03:32:39.533298616 +0000 UTC m=+1241.242604042" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:33:51.843389797 +0000 UTC m=+1313.552695223" watchObservedRunningTime="2025-12-09 03:33:51.848944787 +0000 UTC m=+1313.558250223" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.992134 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s25fz-config-pbhgt"] Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.993714 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:51 crc kubenswrapper[4766]: I1209 03:33:51.996191 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.002518 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s25fz-config-pbhgt"] Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.114360 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-log-ovn\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.114561 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-additional-scripts\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.114742 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run-ovn\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.114977 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: 
\"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.115032 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-scripts\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.115116 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5hq9\" (UniqueName: \"kubernetes.io/projected/36fd8788-877f-4af5-a167-0d700accae32-kube-api-access-s5hq9\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.120892 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bkt59"] Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.216953 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5hq9\" (UniqueName: \"kubernetes.io/projected/36fd8788-877f-4af5-a167-0d700accae32-kube-api-access-s5hq9\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.217063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-log-ovn\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.217124 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-additional-scripts\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.217156 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run-ovn\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.217257 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.217297 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-scripts\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.217431 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-log-ovn\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.217480 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run-ovn\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.217539 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.218183 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-additional-scripts\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.219610 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-scripts\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.238154 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5hq9\" (UniqueName: \"kubernetes.io/projected/36fd8788-877f-4af5-a167-0d700accae32-kube-api-access-s5hq9\") pod \"ovn-controller-s25fz-config-pbhgt\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.326479 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.773297 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s25fz-config-pbhgt"] Dec 09 03:33:52 crc kubenswrapper[4766]: W1209 03:33:52.779061 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36fd8788_877f_4af5_a167_0d700accae32.slice/crio-86112ca88eb7ecdb8770cb498f24588b4abf806f38083517075d6ec93694adbc WatchSource:0}: Error finding container 86112ca88eb7ecdb8770cb498f24588b4abf806f38083517075d6ec93694adbc: Status 404 returned error can't find the container with id 86112ca88eb7ecdb8770cb498f24588b4abf806f38083517075d6ec93694adbc Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.817679 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7"} Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.818602 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz-config-pbhgt" event={"ID":"36fd8788-877f-4af5-a167-0d700accae32","Type":"ContainerStarted","Data":"86112ca88eb7ecdb8770cb498f24588b4abf806f38083517075d6ec93694adbc"} Dec 09 03:33:52 crc kubenswrapper[4766]: I1209 03:33:52.819988 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bkt59" event={"ID":"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a","Type":"ContainerStarted","Data":"c892acb5b8f99d58769c30fbec9c89ed27ac4e322dd2864571e40ddcb36832f2"} Dec 09 03:33:53 crc kubenswrapper[4766]: I1209 03:33:53.845529 4766 generic.go:334] "Generic (PLEG): container finished" podID="36fd8788-877f-4af5-a167-0d700accae32" containerID="c05a6f7c85a947116735fa7ade874110dcd927f792cdf1a18a3bb6036f564a48" exitCode=0 Dec 09 03:33:53 crc 
kubenswrapper[4766]: I1209 03:33:53.845717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz-config-pbhgt" event={"ID":"36fd8788-877f-4af5-a167-0d700accae32","Type":"ContainerDied","Data":"c05a6f7c85a947116735fa7ade874110dcd927f792cdf1a18a3bb6036f564a48"} Dec 09 03:33:53 crc kubenswrapper[4766]: I1209 03:33:53.853099 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73"} Dec 09 03:33:53 crc kubenswrapper[4766]: I1209 03:33:53.853144 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac"} Dec 09 03:33:53 crc kubenswrapper[4766]: I1209 03:33:53.853153 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba"} Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.661961 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.780206 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-additional-scripts\") pod \"36fd8788-877f-4af5-a167-0d700accae32\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.780350 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run-ovn\") pod \"36fd8788-877f-4af5-a167-0d700accae32\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.780430 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-log-ovn\") pod \"36fd8788-877f-4af5-a167-0d700accae32\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.780489 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-scripts\") pod \"36fd8788-877f-4af5-a167-0d700accae32\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.780529 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run\") pod \"36fd8788-877f-4af5-a167-0d700accae32\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.780552 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5hq9\" (UniqueName: 
\"kubernetes.io/projected/36fd8788-877f-4af5-a167-0d700accae32-kube-api-access-s5hq9\") pod \"36fd8788-877f-4af5-a167-0d700accae32\" (UID: \"36fd8788-877f-4af5-a167-0d700accae32\") " Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.781818 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "36fd8788-877f-4af5-a167-0d700accae32" (UID: "36fd8788-877f-4af5-a167-0d700accae32"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.781839 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "36fd8788-877f-4af5-a167-0d700accae32" (UID: "36fd8788-877f-4af5-a167-0d700accae32"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.781860 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "36fd8788-877f-4af5-a167-0d700accae32" (UID: "36fd8788-877f-4af5-a167-0d700accae32"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.781886 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run" (OuterVolumeSpecName: "var-run") pod "36fd8788-877f-4af5-a167-0d700accae32" (UID: "36fd8788-877f-4af5-a167-0d700accae32"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.782684 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-scripts" (OuterVolumeSpecName: "scripts") pod "36fd8788-877f-4af5-a167-0d700accae32" (UID: "36fd8788-877f-4af5-a167-0d700accae32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.799102 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fd8788-877f-4af5-a167-0d700accae32-kube-api-access-s5hq9" (OuterVolumeSpecName: "kube-api-access-s5hq9") pod "36fd8788-877f-4af5-a167-0d700accae32" (UID: "36fd8788-877f-4af5-a167-0d700accae32"). InnerVolumeSpecName "kube-api-access-s5hq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.874148 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz-config-pbhgt" event={"ID":"36fd8788-877f-4af5-a167-0d700accae32","Type":"ContainerDied","Data":"86112ca88eb7ecdb8770cb498f24588b4abf806f38083517075d6ec93694adbc"} Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.874203 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86112ca88eb7ecdb8770cb498f24588b4abf806f38083517075d6ec93694adbc" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.874316 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s25fz-config-pbhgt" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.882631 4766 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.882658 4766 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.882667 4766 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.882677 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36fd8788-877f-4af5-a167-0d700accae32-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.882711 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36fd8788-877f-4af5-a167-0d700accae32-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.882720 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5hq9\" (UniqueName: \"kubernetes.io/projected/36fd8788-877f-4af5-a167-0d700accae32-kube-api-access-s5hq9\") on node \"crc\" DevicePath \"\"" Dec 09 03:33:55 crc kubenswrapper[4766]: I1209 03:33:55.891659 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea"} Dec 09 03:33:56 crc 
kubenswrapper[4766]: I1209 03:33:56.800831 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s25fz" Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.820377 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s25fz-config-pbhgt"] Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.831819 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s25fz-config-pbhgt"] Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.851081 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fd8788-877f-4af5-a167-0d700accae32" path="/var/lib/kubelet/pods/36fd8788-877f-4af5-a167-0d700accae32/volumes" Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.903367 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s25fz-config-qrtj6"] Dec 09 03:33:56 crc kubenswrapper[4766]: E1209 03:33:56.903711 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fd8788-877f-4af5-a167-0d700accae32" containerName="ovn-config" Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.903727 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fd8788-877f-4af5-a167-0d700accae32" containerName="ovn-config" Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.903891 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fd8788-877f-4af5-a167-0d700accae32" containerName="ovn-config" Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.904560 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.907830 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.910684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c"} Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.910725 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b"} Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.910739 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527"} Dec 09 03:33:56 crc kubenswrapper[4766]: I1209 03:33:56.913337 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s25fz-config-qrtj6"] Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.000172 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46n4g\" (UniqueName: \"kubernetes.io/projected/4c206fd0-91da-4604-9cfc-0529ae04bd18-kube-api-access-46n4g\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.000393 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-scripts\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.000431 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-log-ovn\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.000454 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run-ovn\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.000774 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-additional-scripts\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.000799 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102416 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-scripts\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102455 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-log-ovn\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102471 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run-ovn\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102498 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102515 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-additional-scripts\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102544 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-46n4g\" (UniqueName: \"kubernetes.io/projected/4c206fd0-91da-4604-9cfc-0529ae04bd18-kube-api-access-46n4g\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102813 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102828 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run-ovn\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.102863 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-log-ovn\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.103491 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-additional-scripts\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.104793 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-scripts\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.128435 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46n4g\" (UniqueName: \"kubernetes.io/projected/4c206fd0-91da-4604-9cfc-0529ae04bd18-kube-api-access-46n4g\") pod \"ovn-controller-s25fz-config-qrtj6\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.226338 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.740357 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s25fz-config-qrtj6"] Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.935200 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2"} Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.935269 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f"} Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.935283 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerStarted","Data":"e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74"} Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.936725 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovn-controller-s25fz-config-qrtj6" event={"ID":"4c206fd0-91da-4604-9cfc-0529ae04bd18","Type":"ContainerStarted","Data":"e172831facf5642d3aa72bcc292688b42bae244386e9dd022d57b86e459bf855"} Dec 09 03:33:57 crc kubenswrapper[4766]: I1209 03:33:57.972895 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.26644075 podStartE2EDuration="35.972876382s" podCreationTimestamp="2025-12-09 03:33:22 +0000 UTC" firstStartedPulling="2025-12-09 03:33:39.857902646 +0000 UTC m=+1301.567208072" lastFinishedPulling="2025-12-09 03:33:55.564338278 +0000 UTC m=+1317.273643704" observedRunningTime="2025-12-09 03:33:57.966931192 +0000 UTC m=+1319.676236618" watchObservedRunningTime="2025-12-09 03:33:57.972876382 +0000 UTC m=+1319.682181818" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.249375 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2bnr"] Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.251010 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.253801 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.259080 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2bnr"] Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.319525 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr6j9\" (UniqueName: \"kubernetes.io/projected/ba8836f7-6293-4cfc-a615-5c2517576d06-kube-api-access-nr6j9\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.319605 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.319670 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.319718 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-config\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.319753 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.319779 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.420884 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-config\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.420943 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.420973 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.421012 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr6j9\" (UniqueName: \"kubernetes.io/projected/ba8836f7-6293-4cfc-a615-5c2517576d06-kube-api-access-nr6j9\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.421052 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.421078 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.421699 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-config\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.421763 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: 
I1209 03:33:58.421804 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.422238 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.422550 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.443534 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr6j9\" (UniqueName: \"kubernetes.io/projected/ba8836f7-6293-4cfc-a615-5c2517576d06-kube-api-access-nr6j9\") pod \"dnsmasq-dns-77585f5f8c-x2bnr\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.588699 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.948927 4766 generic.go:334] "Generic (PLEG): container finished" podID="4c206fd0-91da-4604-9cfc-0529ae04bd18" containerID="5e424dade2953e128931f7283ccbc4c6fecc4b4958d07c505678a5238649dfb3" exitCode=0 Dec 09 03:33:58 crc kubenswrapper[4766]: I1209 03:33:58.948989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz-config-qrtj6" event={"ID":"4c206fd0-91da-4604-9cfc-0529ae04bd18","Type":"ContainerDied","Data":"5e424dade2953e128931f7283ccbc4c6fecc4b4958d07c505678a5238649dfb3"} Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.697678 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785073 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-log-ovn\") pod \"4c206fd0-91da-4604-9cfc-0529ae04bd18\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785144 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-additional-scripts\") pod \"4c206fd0-91da-4604-9cfc-0529ae04bd18\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785164 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run-ovn\") pod \"4c206fd0-91da-4604-9cfc-0529ae04bd18\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785199 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-scripts\") pod \"4c206fd0-91da-4604-9cfc-0529ae04bd18\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785253 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46n4g\" (UniqueName: \"kubernetes.io/projected/4c206fd0-91da-4604-9cfc-0529ae04bd18-kube-api-access-46n4g\") pod \"4c206fd0-91da-4604-9cfc-0529ae04bd18\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785270 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run\") pod \"4c206fd0-91da-4604-9cfc-0529ae04bd18\" (UID: \"4c206fd0-91da-4604-9cfc-0529ae04bd18\") " Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785369 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4c206fd0-91da-4604-9cfc-0529ae04bd18" (UID: "4c206fd0-91da-4604-9cfc-0529ae04bd18"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785397 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4c206fd0-91da-4604-9cfc-0529ae04bd18" (UID: "4c206fd0-91da-4604-9cfc-0529ae04bd18"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785524 4766 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785538 4766 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.785977 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4c206fd0-91da-4604-9cfc-0529ae04bd18" (UID: "4c206fd0-91da-4604-9cfc-0529ae04bd18"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.786189 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run" (OuterVolumeSpecName: "var-run") pod "4c206fd0-91da-4604-9cfc-0529ae04bd18" (UID: "4c206fd0-91da-4604-9cfc-0529ae04bd18"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.786174 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-scripts" (OuterVolumeSpecName: "scripts") pod "4c206fd0-91da-4604-9cfc-0529ae04bd18" (UID: "4c206fd0-91da-4604-9cfc-0529ae04bd18"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.790722 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c206fd0-91da-4604-9cfc-0529ae04bd18-kube-api-access-46n4g" (OuterVolumeSpecName: "kube-api-access-46n4g") pod "4c206fd0-91da-4604-9cfc-0529ae04bd18" (UID: "4c206fd0-91da-4604-9cfc-0529ae04bd18"). InnerVolumeSpecName "kube-api-access-46n4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.886636 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.886671 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46n4g\" (UniqueName: \"kubernetes.io/projected/4c206fd0-91da-4604-9cfc-0529ae04bd18-kube-api-access-46n4g\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.886687 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4c206fd0-91da-4604-9cfc-0529ae04bd18-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:06 crc kubenswrapper[4766]: I1209 03:34:06.886699 4766 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4c206fd0-91da-4604-9cfc-0529ae04bd18-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.008363 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2bnr"] Dec 09 03:34:07 crc kubenswrapper[4766]: W1209 03:34:07.030655 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba8836f7_6293_4cfc_a615_5c2517576d06.slice/crio-bc1208b2eb4ef09dc7aa15f8a80aa7677e7024e330e34bd3968941c309667913 WatchSource:0}: Error finding container bc1208b2eb4ef09dc7aa15f8a80aa7677e7024e330e34bd3968941c309667913: Status 404 returned error can't find the container with id bc1208b2eb4ef09dc7aa15f8a80aa7677e7024e330e34bd3968941c309667913 Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.035905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz-config-qrtj6" event={"ID":"4c206fd0-91da-4604-9cfc-0529ae04bd18","Type":"ContainerDied","Data":"e172831facf5642d3aa72bcc292688b42bae244386e9dd022d57b86e459bf855"} Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.035935 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s25fz-config-qrtj6" Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.035940 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e172831facf5642d3aa72bcc292688b42bae244386e9dd022d57b86e459bf855" Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.317012 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.317076 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.317122 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.317871 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86ed0df3c1c9e8bf00e2182a8d0f2c1317600b335e2702ed7b54e17bf114fc74"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.317943 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://86ed0df3c1c9e8bf00e2182a8d0f2c1317600b335e2702ed7b54e17bf114fc74" gracePeriod=600 Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.774588 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s25fz-config-qrtj6"] Dec 09 03:34:07 crc kubenswrapper[4766]: I1209 03:34:07.784742 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s25fz-config-qrtj6"] Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.043792 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerID="e48de07fac1174b5ff1df9ce6e893e8d72e10dac7ce267a9df19c35ce4af1d7f" exitCode=0 Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.043867 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" event={"ID":"ba8836f7-6293-4cfc-a615-5c2517576d06","Type":"ContainerDied","Data":"e48de07fac1174b5ff1df9ce6e893e8d72e10dac7ce267a9df19c35ce4af1d7f"} Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.044339 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" 
event={"ID":"ba8836f7-6293-4cfc-a615-5c2517576d06","Type":"ContainerStarted","Data":"bc1208b2eb4ef09dc7aa15f8a80aa7677e7024e330e34bd3968941c309667913"} Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.046139 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bkt59" event={"ID":"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a","Type":"ContainerStarted","Data":"797200790ce6d3aaa2d627d061ab0d04ee01f885e7da6724a44b88b836f80f26"} Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.051159 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="86ed0df3c1c9e8bf00e2182a8d0f2c1317600b335e2702ed7b54e17bf114fc74" exitCode=0 Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.051206 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"86ed0df3c1c9e8bf00e2182a8d0f2c1317600b335e2702ed7b54e17bf114fc74"} Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.051252 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917"} Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.051271 4766 scope.go:117] "RemoveContainer" containerID="6a0a8a7bff8971534685d42f33b8fda759b536f0edd8ff1b38fb2ef750399fde" Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.118921 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-bkt59" podStartSLOduration=2.627979162 podStartE2EDuration="17.118898236s" podCreationTimestamp="2025-12-09 03:33:51 +0000 UTC" firstStartedPulling="2025-12-09 03:33:52.122990057 +0000 UTC m=+1313.832295483" lastFinishedPulling="2025-12-09 03:34:06.613909141 +0000 
UTC m=+1328.323214557" observedRunningTime="2025-12-09 03:34:08.108343752 +0000 UTC m=+1329.817649178" watchObservedRunningTime="2025-12-09 03:34:08.118898236 +0000 UTC m=+1329.828203672" Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.848540 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c206fd0-91da-4604-9cfc-0529ae04bd18" path="/var/lib/kubelet/pods/4c206fd0-91da-4604-9cfc-0529ae04bd18/volumes" Dec 09 03:34:08 crc kubenswrapper[4766]: I1209 03:34:08.992519 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:34:09 crc kubenswrapper[4766]: I1209 03:34:09.015510 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 09 03:34:09 crc kubenswrapper[4766]: I1209 03:34:09.106334 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" event={"ID":"ba8836f7-6293-4cfc-a615-5c2517576d06","Type":"ContainerStarted","Data":"d22e14114255fd970a5ecfdbdacf22b9307b534b5656bbb1c433a44410595da8"} Dec 09 03:34:09 crc kubenswrapper[4766]: I1209 03:34:09.106932 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:34:09 crc kubenswrapper[4766]: I1209 03:34:09.142161 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" podStartSLOduration=11.142138974 podStartE2EDuration="11.142138974s" podCreationTimestamp="2025-12-09 03:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:09.133638506 +0000 UTC m=+1330.842943932" watchObservedRunningTime="2025-12-09 03:34:09.142138974 +0000 UTC m=+1330.851444400" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.797349 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-db-create-m4l8d"] Dec 09 03:34:10 crc kubenswrapper[4766]: E1209 03:34:10.797998 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c206fd0-91da-4604-9cfc-0529ae04bd18" containerName="ovn-config" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.798016 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c206fd0-91da-4604-9cfc-0529ae04bd18" containerName="ovn-config" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.798232 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c206fd0-91da-4604-9cfc-0529ae04bd18" containerName="ovn-config" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.798907 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m4l8d" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.805644 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-m4l8d"] Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.895954 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d6a3-account-create-update-jllzv"] Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.908291 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d6a3-account-create-update-jllzv"] Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.908423 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d6a3-account-create-update-jllzv" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.913552 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.927835 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vgqln"] Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.928895 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vgqln" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.947604 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vgqln"] Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.961717 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rsr\" (UniqueName: \"kubernetes.io/projected/ba577939-09d4-40a6-b1e7-98f607984111-kube-api-access-j6rsr\") pod \"cinder-db-create-m4l8d\" (UID: \"ba577939-09d4-40a6-b1e7-98f607984111\") " pod="openstack/cinder-db-create-m4l8d" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.962078 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba577939-09d4-40a6-b1e7-98f607984111-operator-scripts\") pod \"cinder-db-create-m4l8d\" (UID: \"ba577939-09d4-40a6-b1e7-98f607984111\") " pod="openstack/cinder-db-create-m4l8d" Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.964242 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-da2b-account-create-update-pq9nx"] Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.965323 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-da2b-account-create-update-pq9nx"
Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.967334 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 09 03:34:10 crc kubenswrapper[4766]: I1209 03:34:10.992959 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-da2b-account-create-update-pq9nx"]
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.063827 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba577939-09d4-40a6-b1e7-98f607984111-operator-scripts\") pod \"cinder-db-create-m4l8d\" (UID: \"ba577939-09d4-40a6-b1e7-98f607984111\") " pod="openstack/cinder-db-create-m4l8d"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.063883 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74a4954a-fbea-4371-bfd0-cb7681daa75e-operator-scripts\") pod \"cinder-da2b-account-create-update-pq9nx\" (UID: \"74a4954a-fbea-4371-bfd0-cb7681daa75e\") " pod="openstack/cinder-da2b-account-create-update-pq9nx"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.063924 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/67a711ad-556a-4ead-a674-dc1ad1522e94-kube-api-access-xbvb9\") pod \"barbican-d6a3-account-create-update-jllzv\" (UID: \"67a711ad-556a-4ead-a674-dc1ad1522e94\") " pod="openstack/barbican-d6a3-account-create-update-jllzv"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.064004 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc4fff1-4a0a-44ab-af32-edad395bef00-operator-scripts\") pod \"barbican-db-create-vgqln\" (UID: \"cdc4fff1-4a0a-44ab-af32-edad395bef00\") " pod="openstack/barbican-db-create-vgqln"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.064035 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mcsw\" (UniqueName: \"kubernetes.io/projected/cdc4fff1-4a0a-44ab-af32-edad395bef00-kube-api-access-2mcsw\") pod \"barbican-db-create-vgqln\" (UID: \"cdc4fff1-4a0a-44ab-af32-edad395bef00\") " pod="openstack/barbican-db-create-vgqln"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.064065 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhc2t\" (UniqueName: \"kubernetes.io/projected/74a4954a-fbea-4371-bfd0-cb7681daa75e-kube-api-access-dhc2t\") pod \"cinder-da2b-account-create-update-pq9nx\" (UID: \"74a4954a-fbea-4371-bfd0-cb7681daa75e\") " pod="openstack/cinder-da2b-account-create-update-pq9nx"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.064084 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6rsr\" (UniqueName: \"kubernetes.io/projected/ba577939-09d4-40a6-b1e7-98f607984111-kube-api-access-j6rsr\") pod \"cinder-db-create-m4l8d\" (UID: \"ba577939-09d4-40a6-b1e7-98f607984111\") " pod="openstack/cinder-db-create-m4l8d"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.064144 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67a711ad-556a-4ead-a674-dc1ad1522e94-operator-scripts\") pod \"barbican-d6a3-account-create-update-jllzv\" (UID: \"67a711ad-556a-4ead-a674-dc1ad1522e94\") " pod="openstack/barbican-d6a3-account-create-update-jllzv"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.064604 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba577939-09d4-40a6-b1e7-98f607984111-operator-scripts\") pod \"cinder-db-create-m4l8d\" (UID: \"ba577939-09d4-40a6-b1e7-98f607984111\") " pod="openstack/cinder-db-create-m4l8d"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.103769 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rsr\" (UniqueName: \"kubernetes.io/projected/ba577939-09d4-40a6-b1e7-98f607984111-kube-api-access-j6rsr\") pod \"cinder-db-create-m4l8d\" (UID: \"ba577939-09d4-40a6-b1e7-98f607984111\") " pod="openstack/cinder-db-create-m4l8d"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.106118 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7cv6p"]
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.107362 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7cv6p"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.113718 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7cv6p"]
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.165297 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67a711ad-556a-4ead-a674-dc1ad1522e94-operator-scripts\") pod \"barbican-d6a3-account-create-update-jllzv\" (UID: \"67a711ad-556a-4ead-a674-dc1ad1522e94\") " pod="openstack/barbican-d6a3-account-create-update-jllzv"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.165411 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74a4954a-fbea-4371-bfd0-cb7681daa75e-operator-scripts\") pod \"cinder-da2b-account-create-update-pq9nx\" (UID: \"74a4954a-fbea-4371-bfd0-cb7681daa75e\") " pod="openstack/cinder-da2b-account-create-update-pq9nx"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.165456 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/67a711ad-556a-4ead-a674-dc1ad1522e94-kube-api-access-xbvb9\") pod \"barbican-d6a3-account-create-update-jllzv\" (UID: \"67a711ad-556a-4ead-a674-dc1ad1522e94\") " pod="openstack/barbican-d6a3-account-create-update-jllzv"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.165513 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc4fff1-4a0a-44ab-af32-edad395bef00-operator-scripts\") pod \"barbican-db-create-vgqln\" (UID: \"cdc4fff1-4a0a-44ab-af32-edad395bef00\") " pod="openstack/barbican-db-create-vgqln"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.165538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mcsw\" (UniqueName: \"kubernetes.io/projected/cdc4fff1-4a0a-44ab-af32-edad395bef00-kube-api-access-2mcsw\") pod \"barbican-db-create-vgqln\" (UID: \"cdc4fff1-4a0a-44ab-af32-edad395bef00\") " pod="openstack/barbican-db-create-vgqln"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.165574 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhc2t\" (UniqueName: \"kubernetes.io/projected/74a4954a-fbea-4371-bfd0-cb7681daa75e-kube-api-access-dhc2t\") pod \"cinder-da2b-account-create-update-pq9nx\" (UID: \"74a4954a-fbea-4371-bfd0-cb7681daa75e\") " pod="openstack/cinder-da2b-account-create-update-pq9nx"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.165853 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wmp64"]
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.166346 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67a711ad-556a-4ead-a674-dc1ad1522e94-operator-scripts\") pod \"barbican-d6a3-account-create-update-jllzv\" (UID: \"67a711ad-556a-4ead-a674-dc1ad1522e94\") " pod="openstack/barbican-d6a3-account-create-update-jllzv"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.166491 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc4fff1-4a0a-44ab-af32-edad395bef00-operator-scripts\") pod \"barbican-db-create-vgqln\" (UID: \"cdc4fff1-4a0a-44ab-af32-edad395bef00\") " pod="openstack/barbican-db-create-vgqln"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.166569 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74a4954a-fbea-4371-bfd0-cb7681daa75e-operator-scripts\") pod \"cinder-da2b-account-create-update-pq9nx\" (UID: \"74a4954a-fbea-4371-bfd0-cb7681daa75e\") " pod="openstack/cinder-da2b-account-create-update-pq9nx"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.166830 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.168581 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.169387 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-74qgj"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.169663 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.169845 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.181791 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wmp64"]
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.190235 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m4l8d"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.191058 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/67a711ad-556a-4ead-a674-dc1ad1522e94-kube-api-access-xbvb9\") pod \"barbican-d6a3-account-create-update-jllzv\" (UID: \"67a711ad-556a-4ead-a674-dc1ad1522e94\") " pod="openstack/barbican-d6a3-account-create-update-jllzv"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.204455 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mcsw\" (UniqueName: \"kubernetes.io/projected/cdc4fff1-4a0a-44ab-af32-edad395bef00-kube-api-access-2mcsw\") pod \"barbican-db-create-vgqln\" (UID: \"cdc4fff1-4a0a-44ab-af32-edad395bef00\") " pod="openstack/barbican-db-create-vgqln"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.207350 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhc2t\" (UniqueName: \"kubernetes.io/projected/74a4954a-fbea-4371-bfd0-cb7681daa75e-kube-api-access-dhc2t\") pod \"cinder-da2b-account-create-update-pq9nx\" (UID: \"74a4954a-fbea-4371-bfd0-cb7681daa75e\") " pod="openstack/cinder-da2b-account-create-update-pq9nx"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.219088 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c2af-account-create-update-n7gmp"]
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.220805 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c2af-account-create-update-n7gmp"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.223467 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.234979 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d6a3-account-create-update-jllzv"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.239779 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c2af-account-create-update-n7gmp"]
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.250369 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vgqln"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.267529 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgj9m\" (UniqueName: \"kubernetes.io/projected/dc2d2d4c-93e7-437e-854f-e768a62c04ee-kube-api-access-bgj9m\") pod \"neutron-db-create-7cv6p\" (UID: \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\") " pod="openstack/neutron-db-create-7cv6p"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.267628 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-combined-ca-bundle\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.267686 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jw58\" (UniqueName: \"kubernetes.io/projected/190b0a88-5610-4895-a497-36c4f6c06810-kube-api-access-6jw58\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.267713 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d2d4c-93e7-437e-854f-e768a62c04ee-operator-scripts\") pod \"neutron-db-create-7cv6p\" (UID: \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\") " pod="openstack/neutron-db-create-7cv6p"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.267739 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-config-data\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.309513 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-da2b-account-create-update-pq9nx"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.368753 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jw58\" (UniqueName: \"kubernetes.io/projected/190b0a88-5610-4895-a497-36c4f6c06810-kube-api-access-6jw58\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.368806 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d2d4c-93e7-437e-854f-e768a62c04ee-operator-scripts\") pod \"neutron-db-create-7cv6p\" (UID: \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\") " pod="openstack/neutron-db-create-7cv6p"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.368839 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-config-data\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.368860 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32828f3e-2f89-41d4-abcd-d4a433a53db1-operator-scripts\") pod \"neutron-c2af-account-create-update-n7gmp\" (UID: \"32828f3e-2f89-41d4-abcd-d4a433a53db1\") " pod="openstack/neutron-c2af-account-create-update-n7gmp"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.368900 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgj9m\" (UniqueName: \"kubernetes.io/projected/dc2d2d4c-93e7-437e-854f-e768a62c04ee-kube-api-access-bgj9m\") pod \"neutron-db-create-7cv6p\" (UID: \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\") " pod="openstack/neutron-db-create-7cv6p"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.368923 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjdz\" (UniqueName: \"kubernetes.io/projected/32828f3e-2f89-41d4-abcd-d4a433a53db1-kube-api-access-fjjdz\") pod \"neutron-c2af-account-create-update-n7gmp\" (UID: \"32828f3e-2f89-41d4-abcd-d4a433a53db1\") " pod="openstack/neutron-c2af-account-create-update-n7gmp"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.368962 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-combined-ca-bundle\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.370084 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d2d4c-93e7-437e-854f-e768a62c04ee-operator-scripts\") pod \"neutron-db-create-7cv6p\" (UID: \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\") " pod="openstack/neutron-db-create-7cv6p"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.376240 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-combined-ca-bundle\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.388721 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-config-data\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.388825 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgj9m\" (UniqueName: \"kubernetes.io/projected/dc2d2d4c-93e7-437e-854f-e768a62c04ee-kube-api-access-bgj9m\") pod \"neutron-db-create-7cv6p\" (UID: \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\") " pod="openstack/neutron-db-create-7cv6p"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.391580 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jw58\" (UniqueName: \"kubernetes.io/projected/190b0a88-5610-4895-a497-36c4f6c06810-kube-api-access-6jw58\") pod \"keystone-db-sync-wmp64\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.441298 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7cv6p"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.470937 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32828f3e-2f89-41d4-abcd-d4a433a53db1-operator-scripts\") pod \"neutron-c2af-account-create-update-n7gmp\" (UID: \"32828f3e-2f89-41d4-abcd-d4a433a53db1\") " pod="openstack/neutron-c2af-account-create-update-n7gmp"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.471000 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjdz\" (UniqueName: \"kubernetes.io/projected/32828f3e-2f89-41d4-abcd-d4a433a53db1-kube-api-access-fjjdz\") pod \"neutron-c2af-account-create-update-n7gmp\" (UID: \"32828f3e-2f89-41d4-abcd-d4a433a53db1\") " pod="openstack/neutron-c2af-account-create-update-n7gmp"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.471703 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32828f3e-2f89-41d4-abcd-d4a433a53db1-operator-scripts\") pod \"neutron-c2af-account-create-update-n7gmp\" (UID: \"32828f3e-2f89-41d4-abcd-d4a433a53db1\") " pod="openstack/neutron-c2af-account-create-update-n7gmp"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.483420 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wmp64"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.488566 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjdz\" (UniqueName: \"kubernetes.io/projected/32828f3e-2f89-41d4-abcd-d4a433a53db1-kube-api-access-fjjdz\") pod \"neutron-c2af-account-create-update-n7gmp\" (UID: \"32828f3e-2f89-41d4-abcd-d4a433a53db1\") " pod="openstack/neutron-c2af-account-create-update-n7gmp"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.504958 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-m4l8d"]
Dec 09 03:34:11 crc kubenswrapper[4766]: W1209 03:34:11.550064 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba577939_09d4_40a6_b1e7_98f607984111.slice/crio-1e569639d5e37bca990a1a2850926b6a27aa871decfe9a1c17684ce6597a1bcb WatchSource:0}: Error finding container 1e569639d5e37bca990a1a2850926b6a27aa871decfe9a1c17684ce6597a1bcb: Status 404 returned error can't find the container with id 1e569639d5e37bca990a1a2850926b6a27aa871decfe9a1c17684ce6597a1bcb
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.657886 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c2af-account-create-update-n7gmp"
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.899903 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d6a3-account-create-update-jllzv"]
Dec 09 03:34:11 crc kubenswrapper[4766]: I1209 03:34:11.947008 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vgqln"]
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.039827 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-da2b-account-create-update-pq9nx"]
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.151289 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m4l8d" event={"ID":"ba577939-09d4-40a6-b1e7-98f607984111","Type":"ContainerStarted","Data":"1ea189ed90e2f9b071a383669d02f3c49ad21027bdd943cae8002f36da104948"}
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.151673 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m4l8d" event={"ID":"ba577939-09d4-40a6-b1e7-98f607984111","Type":"ContainerStarted","Data":"1e569639d5e37bca990a1a2850926b6a27aa871decfe9a1c17684ce6597a1bcb"}
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.155555 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-da2b-account-create-update-pq9nx" event={"ID":"74a4954a-fbea-4371-bfd0-cb7681daa75e","Type":"ContainerStarted","Data":"1846943c34669c33894898b1faa0ebf5302900065c5d819914a8082165a15fa6"}
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.156826 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d6a3-account-create-update-jllzv" event={"ID":"67a711ad-556a-4ead-a674-dc1ad1522e94","Type":"ContainerStarted","Data":"27c310af8ed1735644d06f10c51aac0840301e4d860868f4b59d12d7a5d9e31c"}
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.157863 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vgqln" event={"ID":"cdc4fff1-4a0a-44ab-af32-edad395bef00","Type":"ContainerStarted","Data":"d8a652d61b2d82af111ad806082a4287917d82d8c7d29815d30e62179892cf53"}
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.170040 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7cv6p"]
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.189306 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wmp64"]
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.196849 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-m4l8d" podStartSLOduration=2.196828787 podStartE2EDuration="2.196828787s" podCreationTimestamp="2025-12-09 03:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:12.189101469 +0000 UTC m=+1333.898406885" watchObservedRunningTime="2025-12-09 03:34:12.196828787 +0000 UTC m=+1333.906134213"
Dec 09 03:34:12 crc kubenswrapper[4766]: I1209 03:34:12.295658 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c2af-account-create-update-n7gmp"]
Dec 09 03:34:12 crc kubenswrapper[4766]: W1209 03:34:12.306876 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32828f3e_2f89_41d4_abcd_d4a433a53db1.slice/crio-3f6cdee751a007d9be05791c50f6c37dc34e8d5d06f71267e64d0686ffc6fe80 WatchSource:0}: Error finding container 3f6cdee751a007d9be05791c50f6c37dc34e8d5d06f71267e64d0686ffc6fe80: Status 404 returned error can't find the container with id 3f6cdee751a007d9be05791c50f6c37dc34e8d5d06f71267e64d0686ffc6fe80
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.170488 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba577939-09d4-40a6-b1e7-98f607984111" containerID="1ea189ed90e2f9b071a383669d02f3c49ad21027bdd943cae8002f36da104948" exitCode=0
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.170561 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m4l8d" event={"ID":"ba577939-09d4-40a6-b1e7-98f607984111","Type":"ContainerDied","Data":"1ea189ed90e2f9b071a383669d02f3c49ad21027bdd943cae8002f36da104948"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.172544 4766 generic.go:334] "Generic (PLEG): container finished" podID="74a4954a-fbea-4371-bfd0-cb7681daa75e" containerID="aeb92723757cb91dd1871bac1a2f9acfdedb227f6bd6772c6fe5cd41ca358b51" exitCode=0
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.172592 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-da2b-account-create-update-pq9nx" event={"ID":"74a4954a-fbea-4371-bfd0-cb7681daa75e","Type":"ContainerDied","Data":"aeb92723757cb91dd1871bac1a2f9acfdedb227f6bd6772c6fe5cd41ca358b51"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.174242 4766 generic.go:334] "Generic (PLEG): container finished" podID="67a711ad-556a-4ead-a674-dc1ad1522e94" containerID="8395f4aaedfdb13ba81601d8b8bed90b4f2057ed88097de8b79f3aa583caf210" exitCode=0
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.174275 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d6a3-account-create-update-jllzv" event={"ID":"67a711ad-556a-4ead-a674-dc1ad1522e94","Type":"ContainerDied","Data":"8395f4aaedfdb13ba81601d8b8bed90b4f2057ed88097de8b79f3aa583caf210"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.180007 4766 generic.go:334] "Generic (PLEG): container finished" podID="cdc4fff1-4a0a-44ab-af32-edad395bef00" containerID="b06965f1ab86e1115e4f9ff8af07336b6f7146502c5e6422ba2fb8b5d1d20e96" exitCode=0
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.180096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vgqln" event={"ID":"cdc4fff1-4a0a-44ab-af32-edad395bef00","Type":"ContainerDied","Data":"b06965f1ab86e1115e4f9ff8af07336b6f7146502c5e6422ba2fb8b5d1d20e96"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.181877 4766 generic.go:334] "Generic (PLEG): container finished" podID="32828f3e-2f89-41d4-abcd-d4a433a53db1" containerID="37efa13168c573c890903d3df4f307a2f1971d2d65054fb7d786360d3f24f242" exitCode=0
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.181915 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c2af-account-create-update-n7gmp" event={"ID":"32828f3e-2f89-41d4-abcd-d4a433a53db1","Type":"ContainerDied","Data":"37efa13168c573c890903d3df4f307a2f1971d2d65054fb7d786360d3f24f242"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.181962 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c2af-account-create-update-n7gmp" event={"ID":"32828f3e-2f89-41d4-abcd-d4a433a53db1","Type":"ContainerStarted","Data":"3f6cdee751a007d9be05791c50f6c37dc34e8d5d06f71267e64d0686ffc6fe80"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.183203 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wmp64" event={"ID":"190b0a88-5610-4895-a497-36c4f6c06810","Type":"ContainerStarted","Data":"d89ee93a0f95f3f037f75f0022ee03e296d7c89930abfe6d69bd9997f8e91131"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.184519 4766 generic.go:334] "Generic (PLEG): container finished" podID="dc2d2d4c-93e7-437e-854f-e768a62c04ee" containerID="0c25d2af5e2e7351b243503284d8e691c6a6faa53b211e11267bea67a92b5a4d" exitCode=0
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.184549 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7cv6p" event={"ID":"dc2d2d4c-93e7-437e-854f-e768a62c04ee","Type":"ContainerDied","Data":"0c25d2af5e2e7351b243503284d8e691c6a6faa53b211e11267bea67a92b5a4d"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.184565 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7cv6p" event={"ID":"dc2d2d4c-93e7-437e-854f-e768a62c04ee","Type":"ContainerStarted","Data":"4d0d0b159ab91848e034d4d8d8109e1fd92c7c4bf506cffef1a0042abe8ecf26"}
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.595287 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr"
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.657384 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-586lc"]
Dec 09 03:34:13 crc kubenswrapper[4766]: I1209 03:34:13.657644 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-586lc" podUID="007b7fec-0143-464a-a4bf-9cf5354e5934" containerName="dnsmasq-dns" containerID="cri-o://0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf" gracePeriod=10
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.165853 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-586lc"
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.196286 4766 generic.go:334] "Generic (PLEG): container finished" podID="007b7fec-0143-464a-a4bf-9cf5354e5934" containerID="0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf" exitCode=0
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.196507 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-586lc"
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.196666 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-586lc" event={"ID":"007b7fec-0143-464a-a4bf-9cf5354e5934","Type":"ContainerDied","Data":"0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf"}
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.196708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-586lc" event={"ID":"007b7fec-0143-464a-a4bf-9cf5354e5934","Type":"ContainerDied","Data":"2c28f7e7972b4bfbbe65a248324150f845013f2525cf8731a6e08cacf8aaaae0"}
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.196732 4766 scope.go:117] "RemoveContainer" containerID="0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf"
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.230312 4766 scope.go:117] "RemoveContainer" containerID="0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb"
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.266250 4766 scope.go:117] "RemoveContainer" containerID="0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf"
Dec 09 03:34:14 crc kubenswrapper[4766]: E1209 03:34:14.266740 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf\": container with ID starting with 0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf not found: ID does not exist" containerID="0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf"
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.266769 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf"} err="failed to get container status \"0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf\": rpc error: code = NotFound desc = could not find container \"0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf\": container with ID starting with 0cca4109ed201f61f20a371fd01ece975ad627bea8fd5334822d0655bf636ebf not found: ID does not exist"
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.266801 4766 scope.go:117] "RemoveContainer" containerID="0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb"
Dec 09 03:34:14 crc kubenswrapper[4766]: E1209 03:34:14.267556 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb\": container with ID starting with 0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb not found: ID does not exist" containerID="0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb"
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.267579 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb"} err="failed to get container status \"0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb\": rpc error: code = NotFound desc = could not find container \"0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb\": container with ID starting with 0ca58c2ba1a829739d401409a819031dd26697b02d95d3f481bb783bc5b4edcb not found: ID does not exist"
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.337838 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49ml5\" (UniqueName: \"kubernetes.io/projected/007b7fec-0143-464a-a4bf-9cf5354e5934-kube-api-access-49ml5\") pod \"007b7fec-0143-464a-a4bf-9cf5354e5934\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") "
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.337895 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-config\") pod \"007b7fec-0143-464a-a4bf-9cf5354e5934\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") "
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.338002 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-dns-svc\") pod \"007b7fec-0143-464a-a4bf-9cf5354e5934\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") "
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.338065 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-nb\") pod \"007b7fec-0143-464a-a4bf-9cf5354e5934\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") "
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.338100 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-sb\") pod \"007b7fec-0143-464a-a4bf-9cf5354e5934\" (UID: \"007b7fec-0143-464a-a4bf-9cf5354e5934\") "
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.356515 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007b7fec-0143-464a-a4bf-9cf5354e5934-kube-api-access-49ml5" (OuterVolumeSpecName: "kube-api-access-49ml5") pod "007b7fec-0143-464a-a4bf-9cf5354e5934" (UID: "007b7fec-0143-464a-a4bf-9cf5354e5934"). InnerVolumeSpecName "kube-api-access-49ml5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.391867 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-config" (OuterVolumeSpecName: "config") pod "007b7fec-0143-464a-a4bf-9cf5354e5934" (UID: "007b7fec-0143-464a-a4bf-9cf5354e5934"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.391975 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "007b7fec-0143-464a-a4bf-9cf5354e5934" (UID: "007b7fec-0143-464a-a4bf-9cf5354e5934"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.393286 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "007b7fec-0143-464a-a4bf-9cf5354e5934" (UID: "007b7fec-0143-464a-a4bf-9cf5354e5934"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.394022 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "007b7fec-0143-464a-a4bf-9cf5354e5934" (UID: "007b7fec-0143-464a-a4bf-9cf5354e5934"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.439574 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49ml5\" (UniqueName: \"kubernetes.io/projected/007b7fec-0143-464a-a4bf-9cf5354e5934-kube-api-access-49ml5\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.439604 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.439615 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.439627 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.439638 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007b7fec-0143-464a-a4bf-9cf5354e5934-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.532979 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-586lc"] Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.543468 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-586lc"] Dec 09 03:34:14 crc kubenswrapper[4766]: I1209 03:34:14.859265 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007b7fec-0143-464a-a4bf-9cf5354e5934" path="/var/lib/kubelet/pods/007b7fec-0143-464a-a4bf-9cf5354e5934/volumes" Dec 09 03:34:16 crc kubenswrapper[4766]: 
I1209 03:34:16.235100 4766 generic.go:334] "Generic (PLEG): container finished" podID="d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" containerID="797200790ce6d3aaa2d627d061ab0d04ee01f885e7da6724a44b88b836f80f26" exitCode=0 Dec 09 03:34:16 crc kubenswrapper[4766]: I1209 03:34:16.235177 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bkt59" event={"ID":"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a","Type":"ContainerDied","Data":"797200790ce6d3aaa2d627d061ab0d04ee01f885e7da6724a44b88b836f80f26"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.079478 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7cv6p" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.106357 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d6a3-account-create-update-jllzv" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.134196 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-da2b-account-create-update-pq9nx" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.148777 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m4l8d" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.150038 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c2af-account-create-update-n7gmp" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.160672 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vgqln" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.172717 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-bkt59" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.213298 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67a711ad-556a-4ead-a674-dc1ad1522e94-operator-scripts\") pod \"67a711ad-556a-4ead-a674-dc1ad1522e94\" (UID: \"67a711ad-556a-4ead-a674-dc1ad1522e94\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.213469 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d2d4c-93e7-437e-854f-e768a62c04ee-operator-scripts\") pod \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\" (UID: \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.213518 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/67a711ad-556a-4ead-a674-dc1ad1522e94-kube-api-access-xbvb9\") pod \"67a711ad-556a-4ead-a674-dc1ad1522e94\" (UID: \"67a711ad-556a-4ead-a674-dc1ad1522e94\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.213582 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgj9m\" (UniqueName: \"kubernetes.io/projected/dc2d2d4c-93e7-437e-854f-e768a62c04ee-kube-api-access-bgj9m\") pod \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\" (UID: \"dc2d2d4c-93e7-437e-854f-e768a62c04ee\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.215121 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2d2d4c-93e7-437e-854f-e768a62c04ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc2d2d4c-93e7-437e-854f-e768a62c04ee" (UID: "dc2d2d4c-93e7-437e-854f-e768a62c04ee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.215194 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a711ad-556a-4ead-a674-dc1ad1522e94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67a711ad-556a-4ead-a674-dc1ad1522e94" (UID: "67a711ad-556a-4ead-a674-dc1ad1522e94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.218838 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2d2d4c-93e7-437e-854f-e768a62c04ee-kube-api-access-bgj9m" (OuterVolumeSpecName: "kube-api-access-bgj9m") pod "dc2d2d4c-93e7-437e-854f-e768a62c04ee" (UID: "dc2d2d4c-93e7-437e-854f-e768a62c04ee"). InnerVolumeSpecName "kube-api-access-bgj9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.219204 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a711ad-556a-4ead-a674-dc1ad1522e94-kube-api-access-xbvb9" (OuterVolumeSpecName: "kube-api-access-xbvb9") pod "67a711ad-556a-4ead-a674-dc1ad1522e94" (UID: "67a711ad-556a-4ead-a674-dc1ad1522e94"). InnerVolumeSpecName "kube-api-access-xbvb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.251597 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bkt59" event={"ID":"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a","Type":"ContainerDied","Data":"c892acb5b8f99d58769c30fbec9c89ed27ac4e322dd2864571e40ddcb36832f2"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.251631 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c892acb5b8f99d58769c30fbec9c89ed27ac4e322dd2864571e40ddcb36832f2" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.251615 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bkt59" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.253167 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vgqln" event={"ID":"cdc4fff1-4a0a-44ab-af32-edad395bef00","Type":"ContainerDied","Data":"d8a652d61b2d82af111ad806082a4287917d82d8c7d29815d30e62179892cf53"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.253186 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vgqln" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.253188 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8a652d61b2d82af111ad806082a4287917d82d8c7d29815d30e62179892cf53" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.254359 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c2af-account-create-update-n7gmp" event={"ID":"32828f3e-2f89-41d4-abcd-d4a433a53db1","Type":"ContainerDied","Data":"3f6cdee751a007d9be05791c50f6c37dc34e8d5d06f71267e64d0686ffc6fe80"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.254383 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f6cdee751a007d9be05791c50f6c37dc34e8d5d06f71267e64d0686ffc6fe80" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.254455 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c2af-account-create-update-n7gmp" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.255760 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wmp64" event={"ID":"190b0a88-5610-4895-a497-36c4f6c06810","Type":"ContainerStarted","Data":"99952e0ef23e63c630103cbcb6fa7f6bfbf1f3297c7069ddbb9d529ac929de90"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.257154 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7cv6p" event={"ID":"dc2d2d4c-93e7-437e-854f-e768a62c04ee","Type":"ContainerDied","Data":"4d0d0b159ab91848e034d4d8d8109e1fd92c7c4bf506cffef1a0042abe8ecf26"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.257176 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d0d0b159ab91848e034d4d8d8109e1fd92c7c4bf506cffef1a0042abe8ecf26" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.257223 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7cv6p" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.259732 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m4l8d" event={"ID":"ba577939-09d4-40a6-b1e7-98f607984111","Type":"ContainerDied","Data":"1e569639d5e37bca990a1a2850926b6a27aa871decfe9a1c17684ce6597a1bcb"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.259770 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e569639d5e37bca990a1a2850926b6a27aa871decfe9a1c17684ce6597a1bcb" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.259766 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m4l8d" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.261103 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-da2b-account-create-update-pq9nx" event={"ID":"74a4954a-fbea-4371-bfd0-cb7681daa75e","Type":"ContainerDied","Data":"1846943c34669c33894898b1faa0ebf5302900065c5d819914a8082165a15fa6"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.261128 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1846943c34669c33894898b1faa0ebf5302900065c5d819914a8082165a15fa6" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.261176 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-da2b-account-create-update-pq9nx" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.262360 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d6a3-account-create-update-jllzv" event={"ID":"67a711ad-556a-4ead-a674-dc1ad1522e94","Type":"ContainerDied","Data":"27c310af8ed1735644d06f10c51aac0840301e4d860868f4b59d12d7a5d9e31c"} Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.262503 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27c310af8ed1735644d06f10c51aac0840301e4d860868f4b59d12d7a5d9e31c" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.262483 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d6a3-account-create-update-jllzv" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.279005 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wmp64" podStartSLOduration=1.557528118 podStartE2EDuration="7.278982568s" podCreationTimestamp="2025-12-09 03:34:11 +0000 UTC" firstStartedPulling="2025-12-09 03:34:12.199197281 +0000 UTC m=+1333.908502707" lastFinishedPulling="2025-12-09 03:34:17.920651731 +0000 UTC m=+1339.629957157" observedRunningTime="2025-12-09 03:34:18.269248806 +0000 UTC m=+1339.978554232" watchObservedRunningTime="2025-12-09 03:34:18.278982568 +0000 UTC m=+1339.988287994" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315120 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-db-sync-config-data\") pod \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315196 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rsr\" (UniqueName: 
\"kubernetes.io/projected/ba577939-09d4-40a6-b1e7-98f607984111-kube-api-access-j6rsr\") pod \"ba577939-09d4-40a6-b1e7-98f607984111\" (UID: \"ba577939-09d4-40a6-b1e7-98f607984111\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315240 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba577939-09d4-40a6-b1e7-98f607984111-operator-scripts\") pod \"ba577939-09d4-40a6-b1e7-98f607984111\" (UID: \"ba577939-09d4-40a6-b1e7-98f607984111\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315267 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjhcl\" (UniqueName: \"kubernetes.io/projected/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-kube-api-access-tjhcl\") pod \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315308 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mcsw\" (UniqueName: \"kubernetes.io/projected/cdc4fff1-4a0a-44ab-af32-edad395bef00-kube-api-access-2mcsw\") pod \"cdc4fff1-4a0a-44ab-af32-edad395bef00\" (UID: \"cdc4fff1-4a0a-44ab-af32-edad395bef00\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315401 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc4fff1-4a0a-44ab-af32-edad395bef00-operator-scripts\") pod \"cdc4fff1-4a0a-44ab-af32-edad395bef00\" (UID: \"cdc4fff1-4a0a-44ab-af32-edad395bef00\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315451 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32828f3e-2f89-41d4-abcd-d4a433a53db1-operator-scripts\") pod \"32828f3e-2f89-41d4-abcd-d4a433a53db1\" (UID: \"32828f3e-2f89-41d4-abcd-d4a433a53db1\") " Dec 09 
03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315480 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-combined-ca-bundle\") pod \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315561 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjdz\" (UniqueName: \"kubernetes.io/projected/32828f3e-2f89-41d4-abcd-d4a433a53db1-kube-api-access-fjjdz\") pod \"32828f3e-2f89-41d4-abcd-d4a433a53db1\" (UID: \"32828f3e-2f89-41d4-abcd-d4a433a53db1\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315614 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhc2t\" (UniqueName: \"kubernetes.io/projected/74a4954a-fbea-4371-bfd0-cb7681daa75e-kube-api-access-dhc2t\") pod \"74a4954a-fbea-4371-bfd0-cb7681daa75e\" (UID: \"74a4954a-fbea-4371-bfd0-cb7681daa75e\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315648 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-config-data\") pod \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\" (UID: \"d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315680 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74a4954a-fbea-4371-bfd0-cb7681daa75e-operator-scripts\") pod \"74a4954a-fbea-4371-bfd0-cb7681daa75e\" (UID: \"74a4954a-fbea-4371-bfd0-cb7681daa75e\") " Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.315791 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ba577939-09d4-40a6-b1e7-98f607984111-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba577939-09d4-40a6-b1e7-98f607984111" (UID: "ba577939-09d4-40a6-b1e7-98f607984111"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.316352 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32828f3e-2f89-41d4-abcd-d4a433a53db1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32828f3e-2f89-41d4-abcd-d4a433a53db1" (UID: "32828f3e-2f89-41d4-abcd-d4a433a53db1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.316435 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba577939-09d4-40a6-b1e7-98f607984111-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.316463 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgj9m\" (UniqueName: \"kubernetes.io/projected/dc2d2d4c-93e7-437e-854f-e768a62c04ee-kube-api-access-bgj9m\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.316480 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67a711ad-556a-4ead-a674-dc1ad1522e94-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.316491 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc2d2d4c-93e7-437e-854f-e768a62c04ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.316505 4766 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xbvb9\" (UniqueName: \"kubernetes.io/projected/67a711ad-556a-4ead-a674-dc1ad1522e94-kube-api-access-xbvb9\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.316486 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc4fff1-4a0a-44ab-af32-edad395bef00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdc4fff1-4a0a-44ab-af32-edad395bef00" (UID: "cdc4fff1-4a0a-44ab-af32-edad395bef00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.316602 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a4954a-fbea-4371-bfd0-cb7681daa75e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74a4954a-fbea-4371-bfd0-cb7681daa75e" (UID: "74a4954a-fbea-4371-bfd0-cb7681daa75e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.320205 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a4954a-fbea-4371-bfd0-cb7681daa75e-kube-api-access-dhc2t" (OuterVolumeSpecName: "kube-api-access-dhc2t") pod "74a4954a-fbea-4371-bfd0-cb7681daa75e" (UID: "74a4954a-fbea-4371-bfd0-cb7681daa75e"). InnerVolumeSpecName "kube-api-access-dhc2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.320913 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba577939-09d4-40a6-b1e7-98f607984111-kube-api-access-j6rsr" (OuterVolumeSpecName: "kube-api-access-j6rsr") pod "ba577939-09d4-40a6-b1e7-98f607984111" (UID: "ba577939-09d4-40a6-b1e7-98f607984111"). InnerVolumeSpecName "kube-api-access-j6rsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.321500 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32828f3e-2f89-41d4-abcd-d4a433a53db1-kube-api-access-fjjdz" (OuterVolumeSpecName: "kube-api-access-fjjdz") pod "32828f3e-2f89-41d4-abcd-d4a433a53db1" (UID: "32828f3e-2f89-41d4-abcd-d4a433a53db1"). InnerVolumeSpecName "kube-api-access-fjjdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.321570 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" (UID: "d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.321593 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-kube-api-access-tjhcl" (OuterVolumeSpecName: "kube-api-access-tjhcl") pod "d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" (UID: "d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a"). InnerVolumeSpecName "kube-api-access-tjhcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.321672 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc4fff1-4a0a-44ab-af32-edad395bef00-kube-api-access-2mcsw" (OuterVolumeSpecName: "kube-api-access-2mcsw") pod "cdc4fff1-4a0a-44ab-af32-edad395bef00" (UID: "cdc4fff1-4a0a-44ab-af32-edad395bef00"). InnerVolumeSpecName "kube-api-access-2mcsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.347795 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" (UID: "d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.362419 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-config-data" (OuterVolumeSpecName: "config-data") pod "d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" (UID: "d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419533 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdc4fff1-4a0a-44ab-af32-edad395bef00-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419561 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32828f3e-2f89-41d4-abcd-d4a433a53db1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419570 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419579 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjjdz\" (UniqueName: \"kubernetes.io/projected/32828f3e-2f89-41d4-abcd-d4a433a53db1-kube-api-access-fjjdz\") on 
node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419588 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhc2t\" (UniqueName: \"kubernetes.io/projected/74a4954a-fbea-4371-bfd0-cb7681daa75e-kube-api-access-dhc2t\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419598 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419607 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74a4954a-fbea-4371-bfd0-cb7681daa75e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419617 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419626 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rsr\" (UniqueName: \"kubernetes.io/projected/ba577939-09d4-40a6-b1e7-98f607984111-kube-api-access-j6rsr\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419634 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjhcl\" (UniqueName: \"kubernetes.io/projected/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a-kube-api-access-tjhcl\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.419643 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mcsw\" (UniqueName: \"kubernetes.io/projected/cdc4fff1-4a0a-44ab-af32-edad395bef00-kube-api-access-2mcsw\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:18 crc 
kubenswrapper[4766]: I1209 03:34:18.686099 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-jrdz9"] Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.686737 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007b7fec-0143-464a-a4bf-9cf5354e5934" containerName="init" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.686901 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="007b7fec-0143-464a-a4bf-9cf5354e5934" containerName="init" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.686911 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba577939-09d4-40a6-b1e7-98f607984111" containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.686917 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba577939-09d4-40a6-b1e7-98f607984111" containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.686932 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007b7fec-0143-464a-a4bf-9cf5354e5934" containerName="dnsmasq-dns" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.686938 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="007b7fec-0143-464a-a4bf-9cf5354e5934" containerName="dnsmasq-dns" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.686974 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a4954a-fbea-4371-bfd0-cb7681daa75e" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.686980 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a4954a-fbea-4371-bfd0-cb7681daa75e" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.686990 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2d2d4c-93e7-437e-854f-e768a62c04ee" containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: 
I1209 03:34:18.686995 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2d2d4c-93e7-437e-854f-e768a62c04ee" containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.687007 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32828f3e-2f89-41d4-abcd-d4a433a53db1" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687014 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="32828f3e-2f89-41d4-abcd-d4a433a53db1" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.687019 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67a711ad-556a-4ead-a674-dc1ad1522e94" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687025 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a711ad-556a-4ead-a674-dc1ad1522e94" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.687051 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" containerName="glance-db-sync" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687057 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" containerName="glance-db-sync" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.687075 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc4fff1-4a0a-44ab-af32-edad395bef00" containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687080 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc4fff1-4a0a-44ab-af32-edad395bef00" containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687368 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2d2d4c-93e7-437e-854f-e768a62c04ee" 
containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687414 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba577939-09d4-40a6-b1e7-98f607984111" containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687426 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="67a711ad-556a-4ead-a674-dc1ad1522e94" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687441 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" containerName="glance-db-sync" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687451 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="32828f3e-2f89-41d4-abcd-d4a433a53db1" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687487 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a4954a-fbea-4371-bfd0-cb7681daa75e" containerName="mariadb-account-create-update" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687500 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc4fff1-4a0a-44ab-af32-edad395bef00" containerName="mariadb-database-create" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.687513 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="007b7fec-0143-464a-a4bf-9cf5354e5934" containerName="dnsmasq-dns" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.688781 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.717019 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-jrdz9"] Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.848113 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-config\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.848176 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.848221 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbf4p\" (UniqueName: \"kubernetes.io/projected/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-kube-api-access-xbf4p\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.849158 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.849280 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.849332 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: E1209 03:34:18.911630 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74a4954a_fbea_4371_bfd0_cb7681daa75e.slice\": RecentStats: unable to find data in memory cache]" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.950649 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-config\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.950906 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.951007 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xbf4p\" (UniqueName: \"kubernetes.io/projected/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-kube-api-access-xbf4p\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.951095 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.951186 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.951287 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.951828 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-config\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.951976 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.952403 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.952477 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.952669 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:18 crc kubenswrapper[4766]: I1209 03:34:18.972275 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbf4p\" (UniqueName: \"kubernetes.io/projected/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-kube-api-access-xbf4p\") pod \"dnsmasq-dns-7ff5475cc9-jrdz9\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:19 crc kubenswrapper[4766]: I1209 03:34:19.176031 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:19 crc kubenswrapper[4766]: I1209 03:34:19.671572 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-jrdz9"] Dec 09 03:34:20 crc kubenswrapper[4766]: I1209 03:34:20.281794 4766 generic.go:334] "Generic (PLEG): container finished" podID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" containerID="4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc" exitCode=0 Dec 09 03:34:20 crc kubenswrapper[4766]: I1209 03:34:20.281906 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" event={"ID":"3b85b07b-4055-48f6-8ce2-30fa4ec3539a","Type":"ContainerDied","Data":"4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc"} Dec 09 03:34:20 crc kubenswrapper[4766]: I1209 03:34:20.282162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" event={"ID":"3b85b07b-4055-48f6-8ce2-30fa4ec3539a","Type":"ContainerStarted","Data":"f9c67d857f11557d368c18120628b307ed31340ae3bbe5769bc77cb5741aa5bb"} Dec 09 03:34:21 crc kubenswrapper[4766]: I1209 03:34:21.296181 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wmp64" event={"ID":"190b0a88-5610-4895-a497-36c4f6c06810","Type":"ContainerDied","Data":"99952e0ef23e63c630103cbcb6fa7f6bfbf1f3297c7069ddbb9d529ac929de90"} Dec 09 03:34:21 crc kubenswrapper[4766]: I1209 03:34:21.296122 4766 generic.go:334] "Generic (PLEG): container finished" podID="190b0a88-5610-4895-a497-36c4f6c06810" containerID="99952e0ef23e63c630103cbcb6fa7f6bfbf1f3297c7069ddbb9d529ac929de90" exitCode=0 Dec 09 03:34:21 crc kubenswrapper[4766]: I1209 03:34:21.299654 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" event={"ID":"3b85b07b-4055-48f6-8ce2-30fa4ec3539a","Type":"ContainerStarted","Data":"c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6"} Dec 
09 03:34:21 crc kubenswrapper[4766]: I1209 03:34:21.299838 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:21 crc kubenswrapper[4766]: I1209 03:34:21.352475 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" podStartSLOduration=3.352450305 podStartE2EDuration="3.352450305s" podCreationTimestamp="2025-12-09 03:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:21.345900269 +0000 UTC m=+1343.055205705" watchObservedRunningTime="2025-12-09 03:34:21.352450305 +0000 UTC m=+1343.061755741" Dec 09 03:34:22 crc kubenswrapper[4766]: I1209 03:34:22.763582 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wmp64" Dec 09 03:34:22 crc kubenswrapper[4766]: I1209 03:34:22.919841 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-config-data\") pod \"190b0a88-5610-4895-a497-36c4f6c06810\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " Dec 09 03:34:22 crc kubenswrapper[4766]: I1209 03:34:22.919912 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-combined-ca-bundle\") pod \"190b0a88-5610-4895-a497-36c4f6c06810\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " Dec 09 03:34:22 crc kubenswrapper[4766]: I1209 03:34:22.919991 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jw58\" (UniqueName: \"kubernetes.io/projected/190b0a88-5610-4895-a497-36c4f6c06810-kube-api-access-6jw58\") pod \"190b0a88-5610-4895-a497-36c4f6c06810\" (UID: \"190b0a88-5610-4895-a497-36c4f6c06810\") " 
Dec 09 03:34:22 crc kubenswrapper[4766]: I1209 03:34:22.929989 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190b0a88-5610-4895-a497-36c4f6c06810-kube-api-access-6jw58" (OuterVolumeSpecName: "kube-api-access-6jw58") pod "190b0a88-5610-4895-a497-36c4f6c06810" (UID: "190b0a88-5610-4895-a497-36c4f6c06810"). InnerVolumeSpecName "kube-api-access-6jw58". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:22 crc kubenswrapper[4766]: I1209 03:34:22.945126 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "190b0a88-5610-4895-a497-36c4f6c06810" (UID: "190b0a88-5610-4895-a497-36c4f6c06810"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:22 crc kubenswrapper[4766]: I1209 03:34:22.997637 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-config-data" (OuterVolumeSpecName: "config-data") pod "190b0a88-5610-4895-a497-36c4f6c06810" (UID: "190b0a88-5610-4895-a497-36c4f6c06810"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.022097 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jw58\" (UniqueName: \"kubernetes.io/projected/190b0a88-5610-4895-a497-36c4f6c06810-kube-api-access-6jw58\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.022136 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.022146 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190b0a88-5610-4895-a497-36c4f6c06810-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.329966 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wmp64" event={"ID":"190b0a88-5610-4895-a497-36c4f6c06810","Type":"ContainerDied","Data":"d89ee93a0f95f3f037f75f0022ee03e296d7c89930abfe6d69bd9997f8e91131"} Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.330313 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d89ee93a0f95f3f037f75f0022ee03e296d7c89930abfe6d69bd9997f8e91131" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.330039 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wmp64" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.596232 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bbtvk"] Dec 09 03:34:23 crc kubenswrapper[4766]: E1209 03:34:23.596644 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190b0a88-5610-4895-a497-36c4f6c06810" containerName="keystone-db-sync" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.596672 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="190b0a88-5610-4895-a497-36c4f6c06810" containerName="keystone-db-sync" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.603425 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="190b0a88-5610-4895-a497-36c4f6c06810" containerName="keystone-db-sync" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.604149 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.606759 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.607079 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.607159 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.607641 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.608170 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-74qgj" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.615427 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bbtvk"] Dec 09 03:34:23 crc 
kubenswrapper[4766]: I1209 03:34:23.638905 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-jrdz9"] Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.639110 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" podUID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" containerName="dnsmasq-dns" containerID="cri-o://c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6" gracePeriod=10 Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.671233 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv"] Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.673875 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.713587 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv"] Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.733716 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-combined-ca-bundle\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.733816 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-scripts\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.733862 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-fernet-keys\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.733926 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-config-data\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.733939 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-credential-keys\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.733968 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5xbj\" (UniqueName: \"kubernetes.io/projected/5999f104-db61-4802-9aab-e5a762f77f30-kube-api-access-v5xbj\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836067 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz28s\" (UniqueName: \"kubernetes.io/projected/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-kube-api-access-hz28s\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836118 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836156 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-combined-ca-bundle\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836187 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-scripts\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836268 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836288 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-fernet-keys\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836333 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-config\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836369 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-config-data\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836385 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-credential-keys\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.836409 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xbj\" (UniqueName: 
\"kubernetes.io/projected/5999f104-db61-4802-9aab-e5a762f77f30-kube-api-access-v5xbj\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.849654 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-combined-ca-bundle\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.857404 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.858035 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-credential-keys\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.858629 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-config-data\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.859913 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-scripts\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.860847 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-fernet-keys\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.866942 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.884723 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.889153 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.889858 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.900075 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5xbj\" (UniqueName: \"kubernetes.io/projected/5999f104-db61-4802-9aab-e5a762f77f30-kube-api-access-v5xbj\") pod \"keystone-bootstrap-bbtvk\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.912605 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-brzmt"] Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.930195 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.941487 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.944979 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.945233 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2vlnm" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.947465 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz28s\" (UniqueName: \"kubernetes.io/projected/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-kube-api-access-hz28s\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.947541 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.947630 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.947753 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc 
kubenswrapper[4766]: I1209 03:34:23.947783 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.947853 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-config\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.947994 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.949156 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-config\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.951269 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.952144 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " 
pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.957110 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.973889 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:23 crc kubenswrapper[4766]: I1209 03:34:23.975682 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-brzmt"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.001172 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz28s\" (UniqueName: \"kubernetes.io/projected/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-kube-api-access-hz28s\") pod \"dnsmasq-dns-5c5cc7c5ff-ljmpv\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.008423 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.054966 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-run-httpd\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.055111 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-db-sync-config-data\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.055260 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-config-data\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.055441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6bf\" (UniqueName: \"kubernetes.io/projected/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-kube-api-access-vf6bf\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.055571 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-scripts\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 
crc kubenswrapper[4766]: I1209 03:34:24.055615 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp9rn\" (UniqueName: \"kubernetes.io/projected/ce20bb03-595e-4809-8bdc-77a8072c15e7-kube-api-access-pp9rn\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.056198 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-g8dqt"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.062378 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-scripts\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.062447 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.062494 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-config-data\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.062560 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") 
" pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.063439 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-log-httpd\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.063547 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-etc-machine-id\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.063589 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-combined-ca-bundle\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.095428 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.106679 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dwrn5" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.106842 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.106951 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.115637 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g8dqt"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.141431 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-97s8x"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.142753 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.148778 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ngs5s" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.149357 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.149462 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167109 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-config-data\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167162 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-config\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167186 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6bf\" (UniqueName: \"kubernetes.io/projected/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-kube-api-access-vf6bf\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167246 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-scripts\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167270 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp9rn\" (UniqueName: \"kubernetes.io/projected/ce20bb03-595e-4809-8bdc-77a8072c15e7-kube-api-access-pp9rn\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167339 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-scripts\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167363 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167390 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-config-data\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167429 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167451 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-log-httpd\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167488 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-etc-machine-id\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167512 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2qjh\" (UniqueName: \"kubernetes.io/projected/de52b533-348b-4fe0-b951-6dfe9c3f86e1-kube-api-access-q2qjh\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " 
pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167532 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-combined-ca-bundle\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167560 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-combined-ca-bundle\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167577 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-run-httpd\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.167600 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-db-sync-config-data\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.176236 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-scripts\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.179194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-config-data\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.179679 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-zf6gw"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.179798 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-scripts\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.183921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-config-data\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.185272 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-log-httpd\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.185345 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-etc-machine-id\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.186679 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.186910 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-run-httpd\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.188907 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.189002 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-combined-ca-bundle\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.196662 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-db-sync-config-data\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.202584 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6bf\" (UniqueName: \"kubernetes.io/projected/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-kube-api-access-vf6bf\") pod \"cinder-db-sync-brzmt\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " pod="openstack/cinder-db-sync-brzmt" Dec 
09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.206230 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp9rn\" (UniqueName: \"kubernetes.io/projected/ce20bb03-595e-4809-8bdc-77a8072c15e7-kube-api-access-pp9rn\") pod \"ceilometer-0\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.206631 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zf6gw"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.206725 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.222531 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-97s8x"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.222592 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.223142 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.223162 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-slkfh" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.256493 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-24bc9"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.264474 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-24bc9"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.264602 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.271467 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-config-data\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.271506 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-combined-ca-bundle\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.271564 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2qjh\" (UniqueName: \"kubernetes.io/projected/de52b533-348b-4fe0-b951-6dfe9c3f86e1-kube-api-access-q2qjh\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.271584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpc69\" (UniqueName: \"kubernetes.io/projected/51f111f0-cbfb-4181-ad96-30f654851763-kube-api-access-mpc69\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.271608 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f111f0-cbfb-4181-ad96-30f654851763-logs\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " 
pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.271626 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-scripts\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.271643 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-combined-ca-bundle\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.271689 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-config\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.275140 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-config\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.286753 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-combined-ca-bundle\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.286791 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-q2qjh\" (UniqueName: \"kubernetes.io/projected/de52b533-348b-4fe0-b951-6dfe9c3f86e1-kube-api-access-q2qjh\") pod \"neutron-db-sync-g8dqt\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.325060 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.334182 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.345883 4766 generic.go:334] "Generic (PLEG): container finished" podID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" containerID="c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6" exitCode=0 Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.345923 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" event={"ID":"3b85b07b-4055-48f6-8ce2-30fa4ec3539a","Type":"ContainerDied","Data":"c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6"} Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.345949 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" event={"ID":"3b85b07b-4055-48f6-8ce2-30fa4ec3539a","Type":"ContainerDied","Data":"f9c67d857f11557d368c18120628b307ed31340ae3bbe5769bc77cb5741aa5bb"} Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.345964 4766 scope.go:117] "RemoveContainer" containerID="c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.346072 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-jrdz9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.380658 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-brzmt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381172 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381258 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-config-data\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381445 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-db-sync-config-data\") pod \"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381522 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-combined-ca-bundle\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381587 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmm8\" (UniqueName: \"kubernetes.io/projected/6d08c90c-fad3-42ae-8950-8d57a79f9654-kube-api-access-8cmm8\") pod \"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " 
pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381617 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-combined-ca-bundle\") pod \"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381647 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-config\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381664 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qgv\" (UniqueName: \"kubernetes.io/projected/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-kube-api-access-l7qgv\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381722 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpc69\" (UniqueName: \"kubernetes.io/projected/51f111f0-cbfb-4181-ad96-30f654851763-kube-api-access-mpc69\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381743 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: 
\"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381780 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f111f0-cbfb-4181-ad96-30f654851763-logs\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381803 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-scripts\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381828 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.381932 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.382311 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f111f0-cbfb-4181-ad96-30f654851763-logs\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc 
kubenswrapper[4766]: I1209 03:34:24.385825 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-combined-ca-bundle\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.391063 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-config-data\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.398253 4766 scope.go:117] "RemoveContainer" containerID="4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.398613 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-scripts\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.401973 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpc69\" (UniqueName: \"kubernetes.io/projected/51f111f0-cbfb-4181-ad96-30f654851763-kube-api-access-mpc69\") pod \"placement-db-sync-97s8x\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.423544 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.478710 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.483449 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-svc\") pod \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.483529 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-config\") pod \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.483580 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-swift-storage-0\") pod \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.483655 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-nb\") pod \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.483750 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbf4p\" (UniqueName: \"kubernetes.io/projected/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-kube-api-access-xbf4p\") pod \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.483769 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-sb\") pod \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\" (UID: \"3b85b07b-4055-48f6-8ce2-30fa4ec3539a\") " Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.483975 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484036 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484069 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-db-sync-config-data\") pod \"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484103 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmm8\" (UniqueName: \"kubernetes.io/projected/6d08c90c-fad3-42ae-8950-8d57a79f9654-kube-api-access-8cmm8\") pod \"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484122 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-combined-ca-bundle\") pod 
\"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484145 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-config\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484159 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qgv\" (UniqueName: \"kubernetes.io/projected/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-kube-api-access-l7qgv\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484185 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484231 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.484482 4766 scope.go:117] "RemoveContainer" containerID="c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.485264 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.485419 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-config\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.485340 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.488538 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-db-sync-config-data\") pod \"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.488761 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: E1209 03:34:24.488869 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6\": container with ID starting with c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6 not found: ID does not exist" containerID="c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.488904 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6"} err="failed to get container status \"c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6\": rpc error: code = NotFound desc = could not find container \"c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6\": container with ID starting with c35ef55d8c7934d9e0e5cbfeb9f466890ad1cbb909f4096f97184ab11fc344a6 not found: ID does not exist" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.488926 4766 scope.go:117] "RemoveContainer" containerID="4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.489112 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.489438 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-kube-api-access-xbf4p" (OuterVolumeSpecName: "kube-api-access-xbf4p") pod "3b85b07b-4055-48f6-8ce2-30fa4ec3539a" (UID: "3b85b07b-4055-48f6-8ce2-30fa4ec3539a"). InnerVolumeSpecName "kube-api-access-xbf4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:24 crc kubenswrapper[4766]: E1209 03:34:24.495890 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc\": container with ID starting with 4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc not found: ID does not exist" containerID="4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.495962 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc"} err="failed to get container status \"4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc\": rpc error: code = NotFound desc = could not find container \"4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc\": container with ID starting with 4bbf229d277d9d67cd3be283f0085fb6ebb6be73440d939f1ac479b154773ffc not found: ID does not exist" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.517854 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-combined-ca-bundle\") pod \"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.524050 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qgv\" (UniqueName: \"kubernetes.io/projected/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-kube-api-access-l7qgv\") pod \"dnsmasq-dns-8b5c85b87-24bc9\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.570763 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8cmm8\" (UniqueName: \"kubernetes.io/projected/6d08c90c-fad3-42ae-8950-8d57a79f9654-kube-api-access-8cmm8\") pod \"barbican-db-sync-zf6gw\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.574625 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.583780 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.585420 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbf4p\" (UniqueName: \"kubernetes.io/projected/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-kube-api-access-xbf4p\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.645873 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bbtvk"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.652062 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b85b07b-4055-48f6-8ce2-30fa4ec3539a" (UID: "3b85b07b-4055-48f6-8ce2-30fa4ec3539a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.687427 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.697145 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-config" (OuterVolumeSpecName: "config") pod "3b85b07b-4055-48f6-8ce2-30fa4ec3539a" (UID: "3b85b07b-4055-48f6-8ce2-30fa4ec3539a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.741930 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b85b07b-4055-48f6-8ce2-30fa4ec3539a" (UID: "3b85b07b-4055-48f6-8ce2-30fa4ec3539a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.751810 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b85b07b-4055-48f6-8ce2-30fa4ec3539a" (UID: "3b85b07b-4055-48f6-8ce2-30fa4ec3539a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.756922 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.770834 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b85b07b-4055-48f6-8ce2-30fa4ec3539a" (UID: "3b85b07b-4055-48f6-8ce2-30fa4ec3539a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.789108 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.789136 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.789145 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.789154 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b85b07b-4055-48f6-8ce2-30fa4ec3539a-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.801751 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:24 crc kubenswrapper[4766]: E1209 03:34:24.802592 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" containerName="dnsmasq-dns" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.805817 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" containerName="dnsmasq-dns" Dec 09 03:34:24 crc kubenswrapper[4766]: E1209 03:34:24.806311 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" containerName="init" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.806411 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" containerName="init" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.806851 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" containerName="dnsmasq-dns" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.808865 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.809293 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.816152 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hqfv8" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.816508 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.816532 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.816724 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.891644 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.891677 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.891707 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4r92\" (UniqueName: \"kubernetes.io/projected/6b4bf031-86cb-480f-9d17-abfae1f00bfb-kube-api-access-b4r92\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 
09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.891725 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.891758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-logs\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.891778 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.891855 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-scripts\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.891870 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:24 crc 
kubenswrapper[4766]: I1209 03:34:24.908020 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.909791 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.909887 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.914935 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 09 03:34:24 crc kubenswrapper[4766]: I1209 03:34:24.915202 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993420 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993467 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4r92\" (UniqueName: \"kubernetes.io/projected/6b4bf031-86cb-480f-9d17-abfae1f00bfb-kube-api-access-b4r92\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993494 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " 
pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993541 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993565 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-logs\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993618 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993688 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993714 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " 
pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993779 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mkxp\" (UniqueName: \"kubernetes.io/projected/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-kube-api-access-2mkxp\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993809 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-scripts\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993833 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993860 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993923 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " 
pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993947 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993972 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.993992 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:24.994665 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.004397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 
03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.005714 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-scripts\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.009703 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.011790 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-logs\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.013883 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-config-data\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.019185 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.039339 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-b4r92\" (UniqueName: \"kubernetes.io/projected/6b4bf031-86cb-480f-9d17-abfae1f00bfb-kube-api-access-b4r92\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.052055 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.071397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.107711 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.107889 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.107919 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.108420 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.108563 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mkxp\" (UniqueName: \"kubernetes.io/projected/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-kube-api-access-2mkxp\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.108619 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.108747 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.110260 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.110308 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.110361 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.112044 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.113854 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.115688 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.118256 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-scripts\") 
pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.121354 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.133901 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mkxp\" (UniqueName: \"kubernetes.io/projected/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-kube-api-access-2mkxp\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.144073 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.232046 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-brzmt"] Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.267274 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.292686 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-jrdz9"] Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.302070 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.303848 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-jrdz9"] Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.377087 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g8dqt"] Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.378805 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" event={"ID":"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65","Type":"ContainerStarted","Data":"56e9c988dcbe8c0f1375dd79b54d4e57ac1319d161b10612f76da1c710edd09d"} Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.378892 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" podUID="38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" containerName="init" containerID="cri-o://d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4" gracePeriod=10 Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.382968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerStarted","Data":"819aa22194b46f4e5d5a170c7b8e78bae8b5f02bb4dbcc9dc57e7617b0c0081a"} Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.389701 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbtvk" event={"ID":"5999f104-db61-4802-9aab-e5a762f77f30","Type":"ContainerStarted","Data":"66c5f0930843a419a9a7e1f1288602d3fa2b7d870624997db5c7a06ab2735603"} Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.391891 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-brzmt" event={"ID":"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc","Type":"ContainerStarted","Data":"4be0abc05661091c0a340bbcdbbed57455a690b1f18ad003d19eab3fad2fb281"} Dec 09 03:34:25 crc 
kubenswrapper[4766]: I1209 03:34:25.572286 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-97s8x"] Dec 09 03:34:25 crc kubenswrapper[4766]: W1209 03:34:25.581498 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe22f23a_5dcd_4bae_9d03_e0873c6e22cb.slice/crio-164f27c00b0172834faa7b55e0fcc54ce5cf6abfac5656d70823f49ec7955b94 WatchSource:0}: Error finding container 164f27c00b0172834faa7b55e0fcc54ce5cf6abfac5656d70823f49ec7955b94: Status 404 returned error can't find the container with id 164f27c00b0172834faa7b55e0fcc54ce5cf6abfac5656d70823f49ec7955b94 Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.601842 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-24bc9"] Dec 09 03:34:25 crc kubenswrapper[4766]: I1209 03:34:25.612436 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zf6gw"] Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.002560 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.120570 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.321843 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.369461 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.399202 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.428262 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.438631 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-nb\") pod \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.438763 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-sb\") pod \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.438870 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-svc\") pod \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.438927 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-config\") pod \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.438995 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-swift-storage-0\") pod \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.439021 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz28s\" (UniqueName: \"kubernetes.io/projected/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-kube-api-access-hz28s\") pod \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\" (UID: \"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65\") " Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.448761 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-97s8x" event={"ID":"51f111f0-cbfb-4181-ad96-30f654851763","Type":"ContainerStarted","Data":"3847638c2032a1b0e75b94451976e9f3118e4f23a705dc90c2d763a39bd3eb50"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.480650 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-kube-api-access-hz28s" (OuterVolumeSpecName: "kube-api-access-hz28s") pod "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" (UID: "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65"). InnerVolumeSpecName "kube-api-access-hz28s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.483955 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8dqt" event={"ID":"de52b533-348b-4fe0-b951-6dfe9c3f86e1","Type":"ContainerStarted","Data":"d10a4a20cfee2935e2676d326b1df6fbc314a11ef1613497c99fe595048642e5"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.483991 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8dqt" event={"ID":"de52b533-348b-4fe0-b951-6dfe9c3f86e1","Type":"ContainerStarted","Data":"937b42ae94d6b3a763e0cdf3a771994aebb123eccce919c132c9b7da5473e463"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.489493 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zf6gw" event={"ID":"6d08c90c-fad3-42ae-8950-8d57a79f9654","Type":"ContainerStarted","Data":"86430353afd214cacdb01d48b5f42087d90dda8d88daa52bbeb2b100402f2c49"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.495616 4766 generic.go:334] "Generic (PLEG): container finished" podID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" containerID="52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1" exitCode=0 Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.495682 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" event={"ID":"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb","Type":"ContainerDied","Data":"52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.495704 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" event={"ID":"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb","Type":"ContainerStarted","Data":"164f27c00b0172834faa7b55e0fcc54ce5cf6abfac5656d70823f49ec7955b94"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.496190 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-config" (OuterVolumeSpecName: "config") pod "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" (UID: "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.501102 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" (UID: "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.501297 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" (UID: "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.507191 4766 generic.go:334] "Generic (PLEG): container finished" podID="38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" containerID="d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4" exitCode=0 Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.507375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" event={"ID":"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65","Type":"ContainerDied","Data":"d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.507422 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" event={"ID":"38b2ce4f-2aaa-4e50-a46a-8e3b63522b65","Type":"ContainerDied","Data":"56e9c988dcbe8c0f1375dd79b54d4e57ac1319d161b10612f76da1c710edd09d"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.507441 4766 scope.go:117] "RemoveContainer" containerID="d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.507445 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-g8dqt" podStartSLOduration=3.50742384 podStartE2EDuration="3.50742384s" podCreationTimestamp="2025-12-09 03:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:26.503872075 +0000 UTC m=+1348.213177501" watchObservedRunningTime="2025-12-09 03:34:26.50742384 +0000 UTC m=+1348.216729266" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.507565 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.520699 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b4bf031-86cb-480f-9d17-abfae1f00bfb","Type":"ContainerStarted","Data":"5abcf6b3b1d1a76978a19853122ea5ce85fb077d8308b1dca973d58b071a66af"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.524253 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbtvk" event={"ID":"5999f104-db61-4802-9aab-e5a762f77f30","Type":"ContainerStarted","Data":"b0a393cd167a7adea99e8e7fe9ff0ab043ea9c4d159ff79b0dade14d2474032d"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.525657 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7cfa604-5696-4e5a-8fdc-5a110586bb6e","Type":"ContainerStarted","Data":"a6d65e8d44de2485c42506a35d3e74a25e5d3387346e0a835930cb8a7eb50a85"} Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.542908 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" (UID: "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.545121 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.545149 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.545162 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.545176 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.545191 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz28s\" (UniqueName: \"kubernetes.io/projected/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-kube-api-access-hz28s\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.574877 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" (UID: "38b2ce4f-2aaa-4e50-a46a-8e3b63522b65"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.579338 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bbtvk" podStartSLOduration=3.579312784 podStartE2EDuration="3.579312784s" podCreationTimestamp="2025-12-09 03:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:26.56280459 +0000 UTC m=+1348.272110026" watchObservedRunningTime="2025-12-09 03:34:26.579312784 +0000 UTC m=+1348.288618210" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.647445 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.654095 4766 scope.go:117] "RemoveContainer" containerID="d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4" Dec 09 03:34:26 crc kubenswrapper[4766]: E1209 03:34:26.654657 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4\": container with ID starting with d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4 not found: ID does not exist" containerID="d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.654687 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4"} err="failed to get container status \"d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4\": rpc error: code = NotFound desc = could not find container \"d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4\": container with 
ID starting with d93a512d26a122dea88cf07b1ec781a4e9e6528385c126ff82db6ac270185ba4 not found: ID does not exist" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.916020 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b85b07b-4055-48f6-8ce2-30fa4ec3539a" path="/var/lib/kubelet/pods/3b85b07b-4055-48f6-8ce2-30fa4ec3539a/volumes" Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.976394 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv"] Dec 09 03:34:26 crc kubenswrapper[4766]: I1209 03:34:26.983164 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-ljmpv"] Dec 09 03:34:27 crc kubenswrapper[4766]: I1209 03:34:27.536988 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b4bf031-86cb-480f-9d17-abfae1f00bfb","Type":"ContainerStarted","Data":"8b462bf0e51192fbcc92f421784e85b9f7ac78f87898fa545457b8b86e638d4d"} Dec 09 03:34:27 crc kubenswrapper[4766]: I1209 03:34:27.538811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7cfa604-5696-4e5a-8fdc-5a110586bb6e","Type":"ContainerStarted","Data":"d37421ddaf5d43f00a607dafa3c68ca43e724521c9243c092e73b2b859c98fc0"} Dec 09 03:34:27 crc kubenswrapper[4766]: I1209 03:34:27.540955 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" event={"ID":"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb","Type":"ContainerStarted","Data":"1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45"} Dec 09 03:34:27 crc kubenswrapper[4766]: I1209 03:34:27.542155 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:27 crc kubenswrapper[4766]: I1209 03:34:27.576638 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" 
podStartSLOduration=3.576616205 podStartE2EDuration="3.576616205s" podCreationTimestamp="2025-12-09 03:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:27.558699533 +0000 UTC m=+1349.268004959" watchObservedRunningTime="2025-12-09 03:34:27.576616205 +0000 UTC m=+1349.285921631" Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.555870 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7cfa604-5696-4e5a-8fdc-5a110586bb6e","Type":"ContainerStarted","Data":"1152d06eee1332da2d90fba14d50483aa683ce65b6a240998a37a2c33cb42034"} Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.555962 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerName="glance-log" containerID="cri-o://d37421ddaf5d43f00a607dafa3c68ca43e724521c9243c092e73b2b859c98fc0" gracePeriod=30 Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.556046 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerName="glance-httpd" containerID="cri-o://1152d06eee1332da2d90fba14d50483aa683ce65b6a240998a37a2c33cb42034" gracePeriod=30 Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.571127 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b4bf031-86cb-480f-9d17-abfae1f00bfb","Type":"ContainerStarted","Data":"283dcc3fec23af93e057bec9e973a93f4b5ea0cb63e4335d461e9f635862c2d9"} Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.571344 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerName="glance-log" 
containerID="cri-o://8b462bf0e51192fbcc92f421784e85b9f7ac78f87898fa545457b8b86e638d4d" gracePeriod=30 Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.571468 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerName="glance-httpd" containerID="cri-o://283dcc3fec23af93e057bec9e973a93f4b5ea0cb63e4335d461e9f635862c2d9" gracePeriod=30 Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.588116 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.588098327 podStartE2EDuration="5.588098327s" podCreationTimestamp="2025-12-09 03:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:28.58672679 +0000 UTC m=+1350.296032216" watchObservedRunningTime="2025-12-09 03:34:28.588098327 +0000 UTC m=+1350.297403753" Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.608760 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.608739422 podStartE2EDuration="5.608739422s" podCreationTimestamp="2025-12-09 03:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:28.604457608 +0000 UTC m=+1350.313763034" watchObservedRunningTime="2025-12-09 03:34:28.608739422 +0000 UTC m=+1350.318044838" Dec 09 03:34:28 crc kubenswrapper[4766]: I1209 03:34:28.849735 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" path="/var/lib/kubelet/pods/38b2ce4f-2aaa-4e50-a46a-8e3b63522b65/volumes" Dec 09 03:34:29 crc kubenswrapper[4766]: E1209 03:34:29.173486 4766 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b4bf031_86cb_480f_9d17_abfae1f00bfb.slice/crio-conmon-8b462bf0e51192fbcc92f421784e85b9f7ac78f87898fa545457b8b86e638d4d.scope\": RecentStats: unable to find data in memory cache]" Dec 09 03:34:29 crc kubenswrapper[4766]: I1209 03:34:29.584485 4766 generic.go:334] "Generic (PLEG): container finished" podID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerID="1152d06eee1332da2d90fba14d50483aa683ce65b6a240998a37a2c33cb42034" exitCode=0 Dec 09 03:34:29 crc kubenswrapper[4766]: I1209 03:34:29.584900 4766 generic.go:334] "Generic (PLEG): container finished" podID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerID="d37421ddaf5d43f00a607dafa3c68ca43e724521c9243c092e73b2b859c98fc0" exitCode=143 Dec 09 03:34:29 crc kubenswrapper[4766]: I1209 03:34:29.584580 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7cfa604-5696-4e5a-8fdc-5a110586bb6e","Type":"ContainerDied","Data":"1152d06eee1332da2d90fba14d50483aa683ce65b6a240998a37a2c33cb42034"} Dec 09 03:34:29 crc kubenswrapper[4766]: I1209 03:34:29.584999 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7cfa604-5696-4e5a-8fdc-5a110586bb6e","Type":"ContainerDied","Data":"d37421ddaf5d43f00a607dafa3c68ca43e724521c9243c092e73b2b859c98fc0"} Dec 09 03:34:29 crc kubenswrapper[4766]: I1209 03:34:29.587583 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerID="283dcc3fec23af93e057bec9e973a93f4b5ea0cb63e4335d461e9f635862c2d9" exitCode=0 Dec 09 03:34:29 crc kubenswrapper[4766]: I1209 03:34:29.587607 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerID="8b462bf0e51192fbcc92f421784e85b9f7ac78f87898fa545457b8b86e638d4d" exitCode=143 Dec 09 03:34:29 crc kubenswrapper[4766]: I1209 
03:34:29.587704 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b4bf031-86cb-480f-9d17-abfae1f00bfb","Type":"ContainerDied","Data":"283dcc3fec23af93e057bec9e973a93f4b5ea0cb63e4335d461e9f635862c2d9"} Dec 09 03:34:29 crc kubenswrapper[4766]: I1209 03:34:29.587752 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b4bf031-86cb-480f-9d17-abfae1f00bfb","Type":"ContainerDied","Data":"8b462bf0e51192fbcc92f421784e85b9f7ac78f87898fa545457b8b86e638d4d"} Dec 09 03:34:30 crc kubenswrapper[4766]: I1209 03:34:30.605359 4766 generic.go:334] "Generic (PLEG): container finished" podID="5999f104-db61-4802-9aab-e5a762f77f30" containerID="b0a393cd167a7adea99e8e7fe9ff0ab043ea9c4d159ff79b0dade14d2474032d" exitCode=0 Dec 09 03:34:30 crc kubenswrapper[4766]: I1209 03:34:30.605449 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbtvk" event={"ID":"5999f104-db61-4802-9aab-e5a762f77f30","Type":"ContainerDied","Data":"b0a393cd167a7adea99e8e7fe9ff0ab043ea9c4d159ff79b0dade14d2474032d"} Dec 09 03:34:34 crc kubenswrapper[4766]: I1209 03:34:34.586399 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:34:34 crc kubenswrapper[4766]: I1209 03:34:34.658351 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2bnr"] Dec 09 03:34:34 crc kubenswrapper[4766]: I1209 03:34:34.658671 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerName="dnsmasq-dns" containerID="cri-o://d22e14114255fd970a5ecfdbdacf22b9307b534b5656bbb1c433a44410595da8" gracePeriod=10 Dec 09 03:34:35 crc kubenswrapper[4766]: I1209 03:34:35.665978 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerID="d22e14114255fd970a5ecfdbdacf22b9307b534b5656bbb1c433a44410595da8" exitCode=0 Dec 09 03:34:35 crc kubenswrapper[4766]: I1209 03:34:35.666021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" event={"ID":"ba8836f7-6293-4cfc-a615-5c2517576d06","Type":"ContainerDied","Data":"d22e14114255fd970a5ecfdbdacf22b9307b534b5656bbb1c433a44410595da8"} Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.623757 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.631780 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.633193 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.658556 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-httpd-run\") pod \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.658659 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-public-tls-certs\") pod \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.659821 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-combined-ca-bundle\") pod 
\"6b4bf031-86cb-480f-9d17-abfae1f00bfb\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.659862 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4r92\" (UniqueName: \"kubernetes.io/projected/6b4bf031-86cb-480f-9d17-abfae1f00bfb-kube-api-access-b4r92\") pod \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.659993 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-config-data\") pod \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.660025 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-scripts\") pod \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.660082 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-logs\") pod \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.660115 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\" (UID: \"6b4bf031-86cb-480f-9d17-abfae1f00bfb\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.666167 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-scripts" 
(OuterVolumeSpecName: "scripts") pod "6b4bf031-86cb-480f-9d17-abfae1f00bfb" (UID: "6b4bf031-86cb-480f-9d17-abfae1f00bfb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.666427 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6b4bf031-86cb-480f-9d17-abfae1f00bfb" (UID: "6b4bf031-86cb-480f-9d17-abfae1f00bfb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.666551 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-logs" (OuterVolumeSpecName: "logs") pod "6b4bf031-86cb-480f-9d17-abfae1f00bfb" (UID: "6b4bf031-86cb-480f-9d17-abfae1f00bfb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.669476 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4bf031-86cb-480f-9d17-abfae1f00bfb-kube-api-access-b4r92" (OuterVolumeSpecName: "kube-api-access-b4r92") pod "6b4bf031-86cb-480f-9d17-abfae1f00bfb" (UID: "6b4bf031-86cb-480f-9d17-abfae1f00bfb"). InnerVolumeSpecName "kube-api-access-b4r92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.672265 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "6b4bf031-86cb-480f-9d17-abfae1f00bfb" (UID: "6b4bf031-86cb-480f-9d17-abfae1f00bfb"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.694405 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b4bf031-86cb-480f-9d17-abfae1f00bfb" (UID: "6b4bf031-86cb-480f-9d17-abfae1f00bfb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.701712 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.702457 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6b4bf031-86cb-480f-9d17-abfae1f00bfb","Type":"ContainerDied","Data":"5abcf6b3b1d1a76978a19853122ea5ce85fb077d8308b1dca973d58b071a66af"} Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.702529 4766 scope.go:117] "RemoveContainer" containerID="283dcc3fec23af93e057bec9e973a93f4b5ea0cb63e4335d461e9f635862c2d9" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.709602 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bbtvk" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.709892 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbtvk" event={"ID":"5999f104-db61-4802-9aab-e5a762f77f30","Type":"ContainerDied","Data":"66c5f0930843a419a9a7e1f1288602d3fa2b7d870624997db5c7a06ab2735603"} Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.709935 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66c5f0930843a419a9a7e1f1288602d3fa2b7d870624997db5c7a06ab2735603" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.716148 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7cfa604-5696-4e5a-8fdc-5a110586bb6e","Type":"ContainerDied","Data":"a6d65e8d44de2485c42506a35d3e74a25e5d3387346e0a835930cb8a7eb50a85"} Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.716266 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.725338 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-config-data" (OuterVolumeSpecName: "config-data") pod "6b4bf031-86cb-480f-9d17-abfae1f00bfb" (UID: "6b4bf031-86cb-480f-9d17-abfae1f00bfb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.762745 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763007 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763052 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-scripts\") pod \"5999f104-db61-4802-9aab-e5a762f77f30\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763096 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-credential-keys\") pod \"5999f104-db61-4802-9aab-e5a762f77f30\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763138 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-config-data\") pod \"5999f104-db61-4802-9aab-e5a762f77f30\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763174 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-fernet-keys\") pod \"5999f104-db61-4802-9aab-e5a762f77f30\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763194 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-scripts\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763237 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mkxp\" (UniqueName: \"kubernetes.io/projected/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-kube-api-access-2mkxp\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763284 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-httpd-run\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763303 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5xbj\" (UniqueName: \"kubernetes.io/projected/5999f104-db61-4802-9aab-e5a762f77f30-kube-api-access-v5xbj\") pod \"5999f104-db61-4802-9aab-e5a762f77f30\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763341 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-config-data\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763355 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-logs\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763423 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-combined-ca-bundle\") pod \"5999f104-db61-4802-9aab-e5a762f77f30\" (UID: \"5999f104-db61-4802-9aab-e5a762f77f30\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.763441 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-combined-ca-bundle\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.764863 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.764882 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4r92\" (UniqueName: \"kubernetes.io/projected/6b4bf031-86cb-480f-9d17-abfae1f00bfb-kube-api-access-b4r92\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.764892 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.764899 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.764910 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.764934 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.764946 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b4bf031-86cb-480f-9d17-abfae1f00bfb-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.768692 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.769688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-logs" (OuterVolumeSpecName: "logs") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.771082 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.771127 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-kube-api-access-2mkxp" (OuterVolumeSpecName: "kube-api-access-2mkxp") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e"). InnerVolumeSpecName "kube-api-access-2mkxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.771222 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5999f104-db61-4802-9aab-e5a762f77f30" (UID: "5999f104-db61-4802-9aab-e5a762f77f30"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.772555 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6b4bf031-86cb-480f-9d17-abfae1f00bfb" (UID: "6b4bf031-86cb-480f-9d17-abfae1f00bfb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.774054 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5999f104-db61-4802-9aab-e5a762f77f30" (UID: "5999f104-db61-4802-9aab-e5a762f77f30"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.776901 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-scripts" (OuterVolumeSpecName: "scripts") pod "5999f104-db61-4802-9aab-e5a762f77f30" (UID: "5999f104-db61-4802-9aab-e5a762f77f30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.782268 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-scripts" (OuterVolumeSpecName: "scripts") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.790545 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5999f104-db61-4802-9aab-e5a762f77f30-kube-api-access-v5xbj" (OuterVolumeSpecName: "kube-api-access-v5xbj") pod "5999f104-db61-4802-9aab-e5a762f77f30" (UID: "5999f104-db61-4802-9aab-e5a762f77f30"). InnerVolumeSpecName "kube-api-access-v5xbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.795164 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.796165 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5999f104-db61-4802-9aab-e5a762f77f30" (UID: "5999f104-db61-4802-9aab-e5a762f77f30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.799367 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-config-data" (OuterVolumeSpecName: "config-data") pod "5999f104-db61-4802-9aab-e5a762f77f30" (UID: "5999f104-db61-4802-9aab-e5a762f77f30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.811368 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: E1209 03:34:37.819759 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs podName:f7cfa604-5696-4e5a-8fdc-5a110586bb6e nodeName:}" failed. No retries permitted until 2025-12-09 03:34:38.319729289 +0000 UTC m=+1360.029034715 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e") : error deleting /var/lib/kubelet/pods/f7cfa604-5696-4e5a-8fdc-5a110586bb6e/volume-subpaths: remove /var/lib/kubelet/pods/f7cfa604-5696-4e5a-8fdc-5a110586bb6e/volume-subpaths: no such file or directory Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.823011 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-config-data" (OuterVolumeSpecName: "config-data") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866855 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866904 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866917 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866926 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866935 4766 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866961 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866970 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866980 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b4bf031-86cb-480f-9d17-abfae1f00bfb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866988 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.866995 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.867006 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5999f104-db61-4802-9aab-e5a762f77f30-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.867027 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 
03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.867035 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mkxp\" (UniqueName: \"kubernetes.io/projected/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-kube-api-access-2mkxp\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.867044 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.867052 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5xbj\" (UniqueName: \"kubernetes.io/projected/5999f104-db61-4802-9aab-e5a762f77f30-kube-api-access-v5xbj\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.886647 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 09 03:34:37 crc kubenswrapper[4766]: I1209 03:34:37.968639 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.046284 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.065297 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.076386 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:38 crc kubenswrapper[4766]: E1209 03:34:38.076811 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerName="glance-log" Dec 09 03:34:38 
crc kubenswrapper[4766]: I1209 03:34:38.076833 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerName="glance-log" Dec 09 03:34:38 crc kubenswrapper[4766]: E1209 03:34:38.076845 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerName="glance-httpd" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.076854 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerName="glance-httpd" Dec 09 03:34:38 crc kubenswrapper[4766]: E1209 03:34:38.076875 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerName="glance-httpd" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.076884 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerName="glance-httpd" Dec 09 03:34:38 crc kubenswrapper[4766]: E1209 03:34:38.076906 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerName="glance-log" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.076924 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerName="glance-log" Dec 09 03:34:38 crc kubenswrapper[4766]: E1209 03:34:38.076952 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5999f104-db61-4802-9aab-e5a762f77f30" containerName="keystone-bootstrap" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.076960 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5999f104-db61-4802-9aab-e5a762f77f30" containerName="keystone-bootstrap" Dec 09 03:34:38 crc kubenswrapper[4766]: E1209 03:34:38.076974 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" containerName="init" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.076983 
4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" containerName="init" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.077180 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b2ce4f-2aaa-4e50-a46a-8e3b63522b65" containerName="init" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.077203 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerName="glance-httpd" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.077239 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerName="glance-log" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.077260 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5999f104-db61-4802-9aab-e5a762f77f30" containerName="keystone-bootstrap" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.077271 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" containerName="glance-httpd" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.077287 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" containerName="glance-log" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.078379 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.081251 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.081663 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.088848 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.173517 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.173559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.173584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-config-data\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.173609 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.173643 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4qtm\" (UniqueName: \"kubernetes.io/projected/23eefead-e67a-4f5b-9a4b-f4506cd61c47-kube-api-access-d4qtm\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.173663 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-scripts\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.173688 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.173720 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-logs\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.274666 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-logs\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275042 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275073 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275098 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-config-data\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275129 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275171 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4qtm\" (UniqueName: 
\"kubernetes.io/projected/23eefead-e67a-4f5b-9a4b-f4506cd61c47-kube-api-access-d4qtm\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275187 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-logs\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275198 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-scripts\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.275788 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.277087 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: 
\"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.279751 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-scripts\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.280265 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.280533 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-config-data\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.281571 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.294551 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4qtm\" (UniqueName: \"kubernetes.io/projected/23eefead-e67a-4f5b-9a4b-f4506cd61c47-kube-api-access-d4qtm\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " 
pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.303516 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.376643 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs\") pod \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\" (UID: \"f7cfa604-5696-4e5a-8fdc-5a110586bb6e\") " Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.380397 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f7cfa604-5696-4e5a-8fdc-5a110586bb6e" (UID: "f7cfa604-5696-4e5a-8fdc-5a110586bb6e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.399669 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.478945 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7cfa604-5696-4e5a-8fdc-5a110586bb6e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.656794 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.665193 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.679813 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.681078 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.683936 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.685031 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.704759 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.782965 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 
03:34:38.783038 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.783099 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz4wc\" (UniqueName: \"kubernetes.io/projected/30e34685-6db8-46e4-9e31-9da18f1b408e-kube-api-access-tz4wc\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.783139 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.783156 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.783201 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-logs\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc 
kubenswrapper[4766]: I1209 03:34:38.783241 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.783258 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.793901 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bbtvk"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.814134 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bbtvk"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.875484 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5999f104-db61-4802-9aab-e5a762f77f30" path="/var/lib/kubelet/pods/5999f104-db61-4802-9aab-e5a762f77f30/volumes" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.876485 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4bf031-86cb-480f-9d17-abfae1f00bfb" path="/var/lib/kubelet/pods/6b4bf031-86cb-480f-9d17-abfae1f00bfb/volumes" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.877258 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7cfa604-5696-4e5a-8fdc-5a110586bb6e" path="/var/lib/kubelet/pods/f7cfa604-5696-4e5a-8fdc-5a110586bb6e/volumes" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.890254 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.890314 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.890375 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-logs\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.890406 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.890428 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.890482 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.890538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.890613 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz4wc\" (UniqueName: \"kubernetes.io/projected/30e34685-6db8-46e4-9e31-9da18f1b408e-kube-api-access-tz4wc\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.892054 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.896892 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.897060 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.897006 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.897781 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-logs\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.900990 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ltkbl"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.901569 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.902736 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.912453 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.912860 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.915959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.916157 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.916539 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.930332 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz4wc\" (UniqueName: \"kubernetes.io/projected/30e34685-6db8-46e4-9e31-9da18f1b408e-kube-api-access-tz4wc\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.930562 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-74qgj" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.934481 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ltkbl"] Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.961865 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.992407 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxlnr\" (UniqueName: \"kubernetes.io/projected/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-kube-api-access-rxlnr\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.992471 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-fernet-keys\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.992531 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-combined-ca-bundle\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.992578 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-scripts\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.992597 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-credential-keys\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.992677 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-config-data\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:38 crc kubenswrapper[4766]: I1209 03:34:38.999625 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.094608 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-config-data\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.094665 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxlnr\" (UniqueName: \"kubernetes.io/projected/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-kube-api-access-rxlnr\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.094693 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-fernet-keys\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.094751 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-combined-ca-bundle\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.094792 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-scripts\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.094808 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-credential-keys\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.098928 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-credential-keys\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.099943 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-combined-ca-bundle\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.102748 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-fernet-keys\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.112670 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-config-data\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.114724 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-scripts\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.128526 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxlnr\" (UniqueName: \"kubernetes.io/projected/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-kube-api-access-rxlnr\") pod \"keystone-bootstrap-ltkbl\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:39 crc kubenswrapper[4766]: I1209 03:34:39.341650 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:43 crc kubenswrapper[4766]: I1209 03:34:43.590548 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Dec 09 03:34:48 crc kubenswrapper[4766]: E1209 03:34:48.189303 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 09 03:34:48 crc kubenswrapper[4766]: E1209 03:34:48.190905 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cmm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:
nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-zf6gw_openstack(6d08c90c-fad3-42ae-8950-8d57a79f9654): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:34:48 crc kubenswrapper[4766]: E1209 03:34:48.192535 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-zf6gw" podUID="6d08c90c-fad3-42ae-8950-8d57a79f9654" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.262661 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.325095 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-sb\") pod \"ba8836f7-6293-4cfc-a615-5c2517576d06\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.325155 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-nb\") pod \"ba8836f7-6293-4cfc-a615-5c2517576d06\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.325176 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-svc\") pod \"ba8836f7-6293-4cfc-a615-5c2517576d06\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.325280 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-config\") pod \"ba8836f7-6293-4cfc-a615-5c2517576d06\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.325330 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-swift-storage-0\") pod \"ba8836f7-6293-4cfc-a615-5c2517576d06\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.325377 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr6j9\" 
(UniqueName: \"kubernetes.io/projected/ba8836f7-6293-4cfc-a615-5c2517576d06-kube-api-access-nr6j9\") pod \"ba8836f7-6293-4cfc-a615-5c2517576d06\" (UID: \"ba8836f7-6293-4cfc-a615-5c2517576d06\") " Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.331303 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8836f7-6293-4cfc-a615-5c2517576d06-kube-api-access-nr6j9" (OuterVolumeSpecName: "kube-api-access-nr6j9") pod "ba8836f7-6293-4cfc-a615-5c2517576d06" (UID: "ba8836f7-6293-4cfc-a615-5c2517576d06"). InnerVolumeSpecName "kube-api-access-nr6j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.379741 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-config" (OuterVolumeSpecName: "config") pod "ba8836f7-6293-4cfc-a615-5c2517576d06" (UID: "ba8836f7-6293-4cfc-a615-5c2517576d06"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.388491 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba8836f7-6293-4cfc-a615-5c2517576d06" (UID: "ba8836f7-6293-4cfc-a615-5c2517576d06"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.388811 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ba8836f7-6293-4cfc-a615-5c2517576d06" (UID: "ba8836f7-6293-4cfc-a615-5c2517576d06"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.394385 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba8836f7-6293-4cfc-a615-5c2517576d06" (UID: "ba8836f7-6293-4cfc-a615-5c2517576d06"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.402023 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba8836f7-6293-4cfc-a615-5c2517576d06" (UID: "ba8836f7-6293-4cfc-a615-5c2517576d06"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.427397 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr6j9\" (UniqueName: \"kubernetes.io/projected/ba8836f7-6293-4cfc-a615-5c2517576d06-kube-api-access-nr6j9\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.427442 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.427451 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.427461 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-svc\") on node \"crc\" 
DevicePath \"\"" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.427469 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.427478 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba8836f7-6293-4cfc-a615-5c2517576d06-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.591779 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.835917 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.835921 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2bnr" event={"ID":"ba8836f7-6293-4cfc-a615-5c2517576d06","Type":"ContainerDied","Data":"bc1208b2eb4ef09dc7aa15f8a80aa7677e7024e330e34bd3968941c309667913"} Dec 09 03:34:48 crc kubenswrapper[4766]: E1209 03:34:48.837365 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-zf6gw" podUID="6d08c90c-fad3-42ae-8950-8d57a79f9654" Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.884411 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2bnr"] Dec 09 03:34:48 crc kubenswrapper[4766]: I1209 03:34:48.892491 4766 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2bnr"] Dec 09 03:34:49 crc kubenswrapper[4766]: I1209 03:34:49.850626 4766 generic.go:334] "Generic (PLEG): container finished" podID="de52b533-348b-4fe0-b951-6dfe9c3f86e1" containerID="d10a4a20cfee2935e2676d326b1df6fbc314a11ef1613497c99fe595048642e5" exitCode=0 Dec 09 03:34:49 crc kubenswrapper[4766]: I1209 03:34:49.850670 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8dqt" event={"ID":"de52b533-348b-4fe0-b951-6dfe9c3f86e1","Type":"ContainerDied","Data":"d10a4a20cfee2935e2676d326b1df6fbc314a11ef1613497c99fe595048642e5"} Dec 09 03:34:50 crc kubenswrapper[4766]: I1209 03:34:50.750762 4766 scope.go:117] "RemoveContainer" containerID="8b462bf0e51192fbcc92f421784e85b9f7ac78f87898fa545457b8b86e638d4d" Dec 09 03:34:50 crc kubenswrapper[4766]: E1209 03:34:50.772609 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 09 03:34:50 crc kubenswrapper[4766]: E1209 03:34:50.772933 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf6bf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-brzmt_openstack(4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 09 03:34:50 crc kubenswrapper[4766]: E1209 03:34:50.774490 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-brzmt" podUID="4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" Dec 09 03:34:50 crc kubenswrapper[4766]: I1209 03:34:50.877803 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" path="/var/lib/kubelet/pods/ba8836f7-6293-4cfc-a615-5c2517576d06/volumes" Dec 09 03:34:50 crc kubenswrapper[4766]: E1209 03:34:50.883967 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-brzmt" podUID="4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" Dec 09 03:34:50 crc kubenswrapper[4766]: I1209 03:34:50.958797 4766 scope.go:117] "RemoveContainer" containerID="1152d06eee1332da2d90fba14d50483aa683ce65b6a240998a37a2c33cb42034" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.009793 4766 scope.go:117] "RemoveContainer" containerID="d37421ddaf5d43f00a607dafa3c68ca43e724521c9243c092e73b2b859c98fc0" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.039444 4766 scope.go:117] "RemoveContainer" containerID="d22e14114255fd970a5ecfdbdacf22b9307b534b5656bbb1c433a44410595da8" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.077522 4766 
scope.go:117] "RemoveContainer" containerID="e48de07fac1174b5ff1df9ce6e893e8d72e10dac7ce267a9df19c35ce4af1d7f" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.317010 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.320416 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:34:51 crc kubenswrapper[4766]: W1209 03:34:51.327429 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e34685_6db8_46e4_9e31_9da18f1b408e.slice/crio-a0bd2a4bdf99d1e1c4b411b4b2e61f894e775b5f543f15b8c10014925015e337 WatchSource:0}: Error finding container a0bd2a4bdf99d1e1c4b411b4b2e61f894e775b5f543f15b8c10014925015e337: Status 404 returned error can't find the container with id a0bd2a4bdf99d1e1c4b411b4b2e61f894e775b5f543f15b8c10014925015e337 Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.368654 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ltkbl"] Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.384659 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-config\") pod \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.384724 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2qjh\" (UniqueName: \"kubernetes.io/projected/de52b533-348b-4fe0-b951-6dfe9c3f86e1-kube-api-access-q2qjh\") pod \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.384894 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-combined-ca-bundle\") pod \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\" (UID: \"de52b533-348b-4fe0-b951-6dfe9c3f86e1\") " Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.390219 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de52b533-348b-4fe0-b951-6dfe9c3f86e1-kube-api-access-q2qjh" (OuterVolumeSpecName: "kube-api-access-q2qjh") pod "de52b533-348b-4fe0-b951-6dfe9c3f86e1" (UID: "de52b533-348b-4fe0-b951-6dfe9c3f86e1"). InnerVolumeSpecName "kube-api-access-q2qjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.415326 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-config" (OuterVolumeSpecName: "config") pod "de52b533-348b-4fe0-b951-6dfe9c3f86e1" (UID: "de52b533-348b-4fe0-b951-6dfe9c3f86e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.421947 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de52b533-348b-4fe0-b951-6dfe9c3f86e1" (UID: "de52b533-348b-4fe0-b951-6dfe9c3f86e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.493043 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.493072 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2qjh\" (UniqueName: \"kubernetes.io/projected/de52b533-348b-4fe0-b951-6dfe9c3f86e1-kube-api-access-q2qjh\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.493082 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de52b533-348b-4fe0-b951-6dfe9c3f86e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.540701 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:34:51 crc kubenswrapper[4766]: W1209 03:34:51.542607 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23eefead_e67a_4f5b_9a4b_f4506cd61c47.slice/crio-6d9ab65940b4124adbd9232d30ffdbedd2b53cf54f3419a2201977483cc7cf20 WatchSource:0}: Error finding container 6d9ab65940b4124adbd9232d30ffdbedd2b53cf54f3419a2201977483cc7cf20: Status 404 returned error can't find the container with id 6d9ab65940b4124adbd9232d30ffdbedd2b53cf54f3419a2201977483cc7cf20 Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.884835 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8dqt" event={"ID":"de52b533-348b-4fe0-b951-6dfe9c3f86e1","Type":"ContainerDied","Data":"937b42ae94d6b3a763e0cdf3a771994aebb123eccce919c132c9b7da5473e463"} Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.885290 4766 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="937b42ae94d6b3a763e0cdf3a771994aebb123eccce919c132c9b7da5473e463" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.885329 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g8dqt" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.887999 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30e34685-6db8-46e4-9e31-9da18f1b408e","Type":"ContainerStarted","Data":"a0bd2a4bdf99d1e1c4b411b4b2e61f894e775b5f543f15b8c10014925015e337"} Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.907222 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ltkbl" event={"ID":"6280c589-1dc7-47f0-9c57-cfdc56dd28ee","Type":"ContainerStarted","Data":"32ecfe87a5271d6fd5dce2e9ea519ef68986440e73fe41076aca0515484e504d"} Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.907270 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ltkbl" event={"ID":"6280c589-1dc7-47f0-9c57-cfdc56dd28ee","Type":"ContainerStarted","Data":"c51b052ff9c0b631673e477691ded4e62554f2ce65ce8bef4d311fec945cf9f4"} Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.917402 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23eefead-e67a-4f5b-9a4b-f4506cd61c47","Type":"ContainerStarted","Data":"6d9ab65940b4124adbd9232d30ffdbedd2b53cf54f3419a2201977483cc7cf20"} Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.922141 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-97s8x" event={"ID":"51f111f0-cbfb-4181-ad96-30f654851763","Type":"ContainerStarted","Data":"2abec2b90461646367e56150d6bfc41963a577b60de43c6d0bc97c3b8d56a479"} Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.925885 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerStarted","Data":"690af035ecf71c8360e71ed1cfe8a13117268672add00219b1815b92e5272e1d"} Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.955562 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ltkbl" podStartSLOduration=13.955545187 podStartE2EDuration="13.955545187s" podCreationTimestamp="2025-12-09 03:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:51.937120531 +0000 UTC m=+1373.646425977" watchObservedRunningTime="2025-12-09 03:34:51.955545187 +0000 UTC m=+1373.664850613" Dec 09 03:34:51 crc kubenswrapper[4766]: I1209 03:34:51.960660 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-97s8x" podStartSLOduration=6.376351305 podStartE2EDuration="28.960643784s" podCreationTimestamp="2025-12-09 03:34:23 +0000 UTC" firstStartedPulling="2025-12-09 03:34:25.586679338 +0000 UTC m=+1347.295984764" lastFinishedPulling="2025-12-09 03:34:48.170971817 +0000 UTC m=+1369.880277243" observedRunningTime="2025-12-09 03:34:51.953267885 +0000 UTC m=+1373.662573301" watchObservedRunningTime="2025-12-09 03:34:51.960643784 +0000 UTC m=+1373.669949210" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.139116 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rpjpr"] Dec 09 03:34:52 crc kubenswrapper[4766]: E1209 03:34:52.139550 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerName="init" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.139572 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerName="init" Dec 09 03:34:52 crc kubenswrapper[4766]: E1209 03:34:52.139593 4766 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerName="dnsmasq-dns" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.139600 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerName="dnsmasq-dns" Dec 09 03:34:52 crc kubenswrapper[4766]: E1209 03:34:52.139643 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de52b533-348b-4fe0-b951-6dfe9c3f86e1" containerName="neutron-db-sync" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.139651 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de52b533-348b-4fe0-b951-6dfe9c3f86e1" containerName="neutron-db-sync" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.139853 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="de52b533-348b-4fe0-b951-6dfe9c3f86e1" containerName="neutron-db-sync" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.139886 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8836f7-6293-4cfc-a615-5c2517576d06" containerName="dnsmasq-dns" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.141033 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.155771 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rpjpr"] Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.211564 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.211613 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.211677 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmpwn\" (UniqueName: \"kubernetes.io/projected/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-kube-api-access-qmpwn\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.211717 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.211759 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.211792 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-config\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.272968 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c7f64cdcb-c5qmq"] Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.282047 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.284246 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c7f64cdcb-c5qmq"] Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.285564 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.285718 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dwrn5" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.285830 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.285943 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313280 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313513 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313570 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-combined-ca-bundle\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313594 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmpwn\" (UniqueName: \"kubernetes.io/projected/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-kube-api-access-qmpwn\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313616 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-ovndb-tls-certs\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313640 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313681 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313706 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-httpd-config\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313729 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-config\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313746 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvhq4\" (UniqueName: \"kubernetes.io/projected/852701bf-d4f9-4b3f-a2a0-76253feafe4f-kube-api-access-xvhq4\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.313766 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-config\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.314173 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.314531 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.314825 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.315030 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.315073 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-config\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: 
\"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.360413 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmpwn\" (UniqueName: \"kubernetes.io/projected/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-kube-api-access-qmpwn\") pod \"dnsmasq-dns-84b966f6c9-rpjpr\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.415278 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-combined-ca-bundle\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.415350 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-ovndb-tls-certs\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.415465 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-httpd-config\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.415529 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvhq4\" (UniqueName: \"kubernetes.io/projected/852701bf-d4f9-4b3f-a2a0-76253feafe4f-kube-api-access-xvhq4\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " 
pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.415566 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-config\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.419532 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-ovndb-tls-certs\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.419912 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-httpd-config\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.436311 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-config\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.436956 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-combined-ca-bundle\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.438896 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-xvhq4\" (UniqueName: \"kubernetes.io/projected/852701bf-d4f9-4b3f-a2a0-76253feafe4f-kube-api-access-xvhq4\") pod \"neutron-7c7f64cdcb-c5qmq\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.460419 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:52 crc kubenswrapper[4766]: I1209 03:34:52.616988 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:53 crc kubenswrapper[4766]: I1209 03:34:53.031408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23eefead-e67a-4f5b-9a4b-f4506cd61c47","Type":"ContainerStarted","Data":"e5ccc7b26989027c52549aa00ff8006413817ad50cca148e1450a061f0fc1451"} Dec 09 03:34:53 crc kubenswrapper[4766]: I1209 03:34:53.060976 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30e34685-6db8-46e4-9e31-9da18f1b408e","Type":"ContainerStarted","Data":"0a942c56c513f07e99a8589443f60f7079ef9d3894b9a9821bd7b1dfeda0999d"} Dec 09 03:34:53 crc kubenswrapper[4766]: I1209 03:34:53.067134 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rpjpr"] Dec 09 03:34:53 crc kubenswrapper[4766]: I1209 03:34:53.276571 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c7f64cdcb-c5qmq"] Dec 09 03:34:53 crc kubenswrapper[4766]: W1209 03:34:53.317948 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852701bf_d4f9_4b3f_a2a0_76253feafe4f.slice/crio-c81a5e6a8b8029d731f2e63d2877e5ce9ac1f3e17939fde1fdae328106493274 WatchSource:0}: Error finding container c81a5e6a8b8029d731f2e63d2877e5ce9ac1f3e17939fde1fdae328106493274: 
Status 404 returned error can't find the container with id c81a5e6a8b8029d731f2e63d2877e5ce9ac1f3e17939fde1fdae328106493274 Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.106165 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23eefead-e67a-4f5b-9a4b-f4506cd61c47","Type":"ContainerStarted","Data":"6afd4036f7abfb99f49a00c85b2b033db21b537124d7e5e086b8000010db07c4"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.109919 4766 generic.go:334] "Generic (PLEG): container finished" podID="51f111f0-cbfb-4181-ad96-30f654851763" containerID="2abec2b90461646367e56150d6bfc41963a577b60de43c6d0bc97c3b8d56a479" exitCode=0 Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.110002 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-97s8x" event={"ID":"51f111f0-cbfb-4181-ad96-30f654851763","Type":"ContainerDied","Data":"2abec2b90461646367e56150d6bfc41963a577b60de43c6d0bc97c3b8d56a479"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.114513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c7f64cdcb-c5qmq" event={"ID":"852701bf-d4f9-4b3f-a2a0-76253feafe4f","Type":"ContainerStarted","Data":"e166ebc5a4242345096ced25e89f65530d35ce71818720197b6d41e577df2045"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.114551 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c7f64cdcb-c5qmq" event={"ID":"852701bf-d4f9-4b3f-a2a0-76253feafe4f","Type":"ContainerStarted","Data":"79606849fff098e5f5d81ec4cb333c19fbcf0832408dee0e4a2616cb5d7c5c81"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.114560 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c7f64cdcb-c5qmq" event={"ID":"852701bf-d4f9-4b3f-a2a0-76253feafe4f","Type":"ContainerStarted","Data":"c81a5e6a8b8029d731f2e63d2877e5ce9ac1f3e17939fde1fdae328106493274"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.115302 
4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.120166 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30e34685-6db8-46e4-9e31-9da18f1b408e","Type":"ContainerStarted","Data":"80aab41f9fbbfac91a743bb7b665b82a67505e1d541d7f949b32bd177465f98e"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.126809 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerStarted","Data":"461930672d3ab54f27251bcecc76b609ffc06aafab7d479ace95b7d2ebe6d090"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.132888 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.132871278 podStartE2EDuration="16.132871278s" podCreationTimestamp="2025-12-09 03:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:54.126661931 +0000 UTC m=+1375.835967357" watchObservedRunningTime="2025-12-09 03:34:54.132871278 +0000 UTC m=+1375.842176704" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.144365 4766 generic.go:334] "Generic (PLEG): container finished" podID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" containerID="047671d421c3e3686a6bfde497ecc73c35b163e06508acad2b846fb0c5c88ec4" exitCode=0 Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.144413 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" event={"ID":"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987","Type":"ContainerDied","Data":"047671d421c3e3686a6bfde497ecc73c35b163e06508acad2b846fb0c5c88ec4"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.144433 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" event={"ID":"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987","Type":"ContainerStarted","Data":"a8b97bac148e3be5931844a801ba200413f2d694a9e09ea69ac0642b4e345fd4"} Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.169586 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c7f64cdcb-c5qmq" podStartSLOduration=2.169566786 podStartE2EDuration="2.169566786s" podCreationTimestamp="2025-12-09 03:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:54.150544374 +0000 UTC m=+1375.859849810" watchObservedRunningTime="2025-12-09 03:34:54.169566786 +0000 UTC m=+1375.878872212" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.221563 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.221541495 podStartE2EDuration="16.221541495s" podCreationTimestamp="2025-12-09 03:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:54.201341501 +0000 UTC m=+1375.910646927" watchObservedRunningTime="2025-12-09 03:34:54.221541495 +0000 UTC m=+1375.930846921" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.801665 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-649b876b57-9jd5p"] Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.803972 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.807465 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.807564 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.819506 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-649b876b57-9jd5p"] Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.966509 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-combined-ca-bundle\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.966559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-internal-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.966682 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-config\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.966724 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-public-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.967241 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-ovndb-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.967652 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-httpd-config\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:54 crc kubenswrapper[4766]: I1209 03:34:54.967671 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tplz2\" (UniqueName: \"kubernetes.io/projected/be23a05e-591f-4bdf-9c5f-8ee930181397-kube-api-access-tplz2\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.070250 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-public-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.070409 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-ovndb-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.070466 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-httpd-config\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.070490 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tplz2\" (UniqueName: \"kubernetes.io/projected/be23a05e-591f-4bdf-9c5f-8ee930181397-kube-api-access-tplz2\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.070531 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-combined-ca-bundle\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.070555 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-internal-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.070612 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-config\") pod 
\"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.077284 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-internal-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.078907 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-public-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.080033 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-ovndb-tls-certs\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.080660 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-combined-ca-bundle\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.080963 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-httpd-config\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 
03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.088346 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-config\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.090645 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tplz2\" (UniqueName: \"kubernetes.io/projected/be23a05e-591f-4bdf-9c5f-8ee930181397-kube-api-access-tplz2\") pod \"neutron-649b876b57-9jd5p\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.124610 4766 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3b85b07b-4055-48f6-8ce2-30fa4ec3539a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3b85b07b-4055-48f6-8ce2-30fa4ec3539a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3b85b07b_4055_48f6_8ce2_30fa4ec3539a.slice" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.141722 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.160097 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" event={"ID":"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987","Type":"ContainerStarted","Data":"03be38dd0db64bfa4381e561ed96be54602652d0daaabe06c972e7098a6d9505"} Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.163888 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.188366 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" podStartSLOduration=3.188349216 podStartE2EDuration="3.188349216s" podCreationTimestamp="2025-12-09 03:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:34:55.185983082 +0000 UTC m=+1376.895288508" watchObservedRunningTime="2025-12-09 03:34:55.188349216 +0000 UTC m=+1376.897654632" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.499382 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.579501 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f111f0-cbfb-4181-ad96-30f654851763-logs\") pod \"51f111f0-cbfb-4181-ad96-30f654851763\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.579546 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-combined-ca-bundle\") pod \"51f111f0-cbfb-4181-ad96-30f654851763\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.579644 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-config-data\") pod \"51f111f0-cbfb-4181-ad96-30f654851763\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.579747 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-scripts\") pod \"51f111f0-cbfb-4181-ad96-30f654851763\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.579774 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpc69\" (UniqueName: \"kubernetes.io/projected/51f111f0-cbfb-4181-ad96-30f654851763-kube-api-access-mpc69\") pod \"51f111f0-cbfb-4181-ad96-30f654851763\" (UID: \"51f111f0-cbfb-4181-ad96-30f654851763\") " Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.585877 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/51f111f0-cbfb-4181-ad96-30f654851763-logs" (OuterVolumeSpecName: "logs") pod "51f111f0-cbfb-4181-ad96-30f654851763" (UID: "51f111f0-cbfb-4181-ad96-30f654851763"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.587238 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f111f0-cbfb-4181-ad96-30f654851763-kube-api-access-mpc69" (OuterVolumeSpecName: "kube-api-access-mpc69") pod "51f111f0-cbfb-4181-ad96-30f654851763" (UID: "51f111f0-cbfb-4181-ad96-30f654851763"). InnerVolumeSpecName "kube-api-access-mpc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.590084 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-scripts" (OuterVolumeSpecName: "scripts") pod "51f111f0-cbfb-4181-ad96-30f654851763" (UID: "51f111f0-cbfb-4181-ad96-30f654851763"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.613324 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-config-data" (OuterVolumeSpecName: "config-data") pod "51f111f0-cbfb-4181-ad96-30f654851763" (UID: "51f111f0-cbfb-4181-ad96-30f654851763"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.626104 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51f111f0-cbfb-4181-ad96-30f654851763" (UID: "51f111f0-cbfb-4181-ad96-30f654851763"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.684665 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpc69\" (UniqueName: \"kubernetes.io/projected/51f111f0-cbfb-4181-ad96-30f654851763-kube-api-access-mpc69\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.684693 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f111f0-cbfb-4181-ad96-30f654851763-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.684703 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.684711 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.684719 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f111f0-cbfb-4181-ad96-30f654851763-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:55 crc kubenswrapper[4766]: I1209 03:34:55.836124 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-649b876b57-9jd5p"] Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.166697 4766 generic.go:334] "Generic (PLEG): container finished" podID="6280c589-1dc7-47f0-9c57-cfdc56dd28ee" containerID="32ecfe87a5271d6fd5dce2e9ea519ef68986440e73fe41076aca0515484e504d" exitCode=0 Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.166791 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ltkbl" 
event={"ID":"6280c589-1dc7-47f0-9c57-cfdc56dd28ee","Type":"ContainerDied","Data":"32ecfe87a5271d6fd5dce2e9ea519ef68986440e73fe41076aca0515484e504d"} Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.171444 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-97s8x" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.171499 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-97s8x" event={"ID":"51f111f0-cbfb-4181-ad96-30f654851763","Type":"ContainerDied","Data":"3847638c2032a1b0e75b94451976e9f3118e4f23a705dc90c2d763a39bd3eb50"} Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.173357 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3847638c2032a1b0e75b94451976e9f3118e4f23a705dc90c2d763a39bd3eb50" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.307500 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9f669bd74-mz8xh"] Dec 09 03:34:56 crc kubenswrapper[4766]: E1209 03:34:56.308013 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f111f0-cbfb-4181-ad96-30f654851763" containerName="placement-db-sync" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.308038 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f111f0-cbfb-4181-ad96-30f654851763" containerName="placement-db-sync" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.308287 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f111f0-cbfb-4181-ad96-30f654851763" containerName="placement-db-sync" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.309244 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.318763 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.320013 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.320144 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9f669bd74-mz8xh"] Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.320302 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.321109 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ngs5s" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.321366 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.410806 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-public-tls-certs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.411167 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvwz7\" (UniqueName: \"kubernetes.io/projected/417726b2-75fd-4efc-84ec-803533df86aa-kube-api-access-fvwz7\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.411204 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-scripts\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.411237 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/417726b2-75fd-4efc-84ec-803533df86aa-logs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.411464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-internal-tls-certs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.411633 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-combined-ca-bundle\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.411759 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-config-data\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.512875 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-combined-ca-bundle\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.512942 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-config-data\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.512984 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-public-tls-certs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.513025 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvwz7\" (UniqueName: \"kubernetes.io/projected/417726b2-75fd-4efc-84ec-803533df86aa-kube-api-access-fvwz7\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.513049 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-scripts\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.513063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/417726b2-75fd-4efc-84ec-803533df86aa-logs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.513121 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-internal-tls-certs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.516165 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/417726b2-75fd-4efc-84ec-803533df86aa-logs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.519577 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-scripts\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.519763 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-public-tls-certs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.519864 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-combined-ca-bundle\") pod \"placement-9f669bd74-mz8xh\" (UID: 
\"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.520462 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-config-data\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.521609 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-internal-tls-certs\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.536157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvwz7\" (UniqueName: \"kubernetes.io/projected/417726b2-75fd-4efc-84ec-803533df86aa-kube-api-access-fvwz7\") pod \"placement-9f669bd74-mz8xh\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:56 crc kubenswrapper[4766]: I1209 03:34:56.642675 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.877250 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.948195 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-combined-ca-bundle\") pod \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.948375 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-scripts\") pod \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.948543 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-fernet-keys\") pod \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.948682 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-credential-keys\") pod \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.948756 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxlnr\" (UniqueName: \"kubernetes.io/projected/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-kube-api-access-rxlnr\") pod \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.948996 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-config-data\") pod \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\" (UID: \"6280c589-1dc7-47f0-9c57-cfdc56dd28ee\") " Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.957356 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6280c589-1dc7-47f0-9c57-cfdc56dd28ee" (UID: "6280c589-1dc7-47f0-9c57-cfdc56dd28ee"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.958338 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-scripts" (OuterVolumeSpecName: "scripts") pod "6280c589-1dc7-47f0-9c57-cfdc56dd28ee" (UID: "6280c589-1dc7-47f0-9c57-cfdc56dd28ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.969262 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-kube-api-access-rxlnr" (OuterVolumeSpecName: "kube-api-access-rxlnr") pod "6280c589-1dc7-47f0-9c57-cfdc56dd28ee" (UID: "6280c589-1dc7-47f0-9c57-cfdc56dd28ee"). InnerVolumeSpecName "kube-api-access-rxlnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.969792 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6280c589-1dc7-47f0-9c57-cfdc56dd28ee" (UID: "6280c589-1dc7-47f0-9c57-cfdc56dd28ee"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:57 crc kubenswrapper[4766]: I1209 03:34:57.984039 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6280c589-1dc7-47f0-9c57-cfdc56dd28ee" (UID: "6280c589-1dc7-47f0-9c57-cfdc56dd28ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.006155 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-config-data" (OuterVolumeSpecName: "config-data") pod "6280c589-1dc7-47f0-9c57-cfdc56dd28ee" (UID: "6280c589-1dc7-47f0-9c57-cfdc56dd28ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.051200 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.051249 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.051258 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.051269 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:58 crc 
kubenswrapper[4766]: I1209 03:34:58.051277 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxlnr\" (UniqueName: \"kubernetes.io/projected/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-kube-api-access-rxlnr\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.051287 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6280c589-1dc7-47f0-9c57-cfdc56dd28ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.190345 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ltkbl" event={"ID":"6280c589-1dc7-47f0-9c57-cfdc56dd28ee","Type":"ContainerDied","Data":"c51b052ff9c0b631673e477691ded4e62554f2ce65ce8bef4d311fec945cf9f4"} Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.190388 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c51b052ff9c0b631673e477691ded4e62554f2ce65ce8bef4d311fec945cf9f4" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.190491 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ltkbl" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.279354 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5fdb76977-cn2gb"] Dec 09 03:34:58 crc kubenswrapper[4766]: E1209 03:34:58.279803 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6280c589-1dc7-47f0-9c57-cfdc56dd28ee" containerName="keystone-bootstrap" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.279825 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6280c589-1dc7-47f0-9c57-cfdc56dd28ee" containerName="keystone-bootstrap" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.280051 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6280c589-1dc7-47f0-9c57-cfdc56dd28ee" containerName="keystone-bootstrap" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.281669 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.286262 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.286280 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.286701 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.286984 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.287114 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.287254 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-keystone-dockercfg-74qgj" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.289881 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fdb76977-cn2gb"] Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.356958 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zq2b\" (UniqueName: \"kubernetes.io/projected/18f03bec-d533-450d-b79b-7f19dc436d94-kube-api-access-2zq2b\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.357017 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-config-data\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.357082 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-public-tls-certs\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.357130 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-scripts\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.357151 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-fernet-keys\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.357178 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-internal-tls-certs\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.357203 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-combined-ca-bundle\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.357270 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-credential-keys\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.400707 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.400753 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.433603 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 03:34:58 crc 
kubenswrapper[4766]: I1209 03:34:58.448139 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.459900 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-scripts\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.460038 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-fernet-keys\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.460067 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-internal-tls-certs\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.460268 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-combined-ca-bundle\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.460415 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-credential-keys\") pod \"keystone-5fdb76977-cn2gb\" (UID: 
\"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.461268 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zq2b\" (UniqueName: \"kubernetes.io/projected/18f03bec-d533-450d-b79b-7f19dc436d94-kube-api-access-2zq2b\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.461436 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-config-data\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.461544 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-public-tls-certs\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.463558 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-internal-tls-certs\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.463667 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-scripts\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc 
kubenswrapper[4766]: I1209 03:34:58.464487 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-fernet-keys\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.465076 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-config-data\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.465565 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-credential-keys\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.467891 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-combined-ca-bundle\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.468341 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-public-tls-certs\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.490130 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zq2b\" 
(UniqueName: \"kubernetes.io/projected/18f03bec-d533-450d-b79b-7f19dc436d94-kube-api-access-2zq2b\") pod \"keystone-5fdb76977-cn2gb\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:58 crc kubenswrapper[4766]: I1209 03:34:58.616437 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:34:59 crc kubenswrapper[4766]: I1209 03:34:58.999809 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:59 crc kubenswrapper[4766]: I1209 03:34:59.000078 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:59 crc kubenswrapper[4766]: I1209 03:34:59.027690 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:59 crc kubenswrapper[4766]: I1209 03:34:59.036821 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:59 crc kubenswrapper[4766]: I1209 03:34:59.200139 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:59 crc kubenswrapper[4766]: I1209 03:34:59.200171 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 03:34:59 crc kubenswrapper[4766]: I1209 03:34:59.200313 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 03:34:59 crc kubenswrapper[4766]: I1209 03:34:59.200327 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.164987 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-9f669bd74-mz8xh"] Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.234288 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f669bd74-mz8xh" event={"ID":"417726b2-75fd-4efc-84ec-803533df86aa","Type":"ContainerStarted","Data":"d991dad51a32ef6a3113dae9632673595f4c0d7407f9c9ca225f6f7068803079"} Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.237684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerStarted","Data":"dae354cbcd90e6ab1b02efec63ffdd4765c865493e500918a2955fe6bf02dd05"} Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.239134 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649b876b57-9jd5p" event={"ID":"be23a05e-591f-4bdf-9c5f-8ee930181397","Type":"ContainerStarted","Data":"7d778b40fe61554f26c1ca0d67e81146904e79a7748a3d0c87bbd49972278c52"} Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.239163 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649b876b57-9jd5p" event={"ID":"be23a05e-591f-4bdf-9c5f-8ee930181397","Type":"ContainerStarted","Data":"1ae9831093f4629cfe9946e8fa58ca65550ce675bd1fdbc21ffcbd735654d2e5"} Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.490007 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fdb76977-cn2gb"] Dec 09 03:35:01 crc kubenswrapper[4766]: W1209 03:35:01.490411 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18f03bec_d533_450d_b79b_7f19dc436d94.slice/crio-d1ef62b942ff38e301503c2836131d0103803379b2c2058684505eb228a90159 WatchSource:0}: Error finding container d1ef62b942ff38e301503c2836131d0103803379b2c2058684505eb228a90159: Status 404 returned error can't find the container with id d1ef62b942ff38e301503c2836131d0103803379b2c2058684505eb228a90159 Dec 09 03:35:01 crc 
kubenswrapper[4766]: I1209 03:35:01.531185 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.531350 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.659983 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.660090 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.664234 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:01 crc kubenswrapper[4766]: I1209 03:35:01.669155 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.247924 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649b876b57-9jd5p" event={"ID":"be23a05e-591f-4bdf-9c5f-8ee930181397","Type":"ContainerStarted","Data":"17927a86e4710dcdbcd88c8f33269d8edc8c5332c71c20da2f6297f153f510c1"} Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.248237 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.249701 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdb76977-cn2gb" event={"ID":"18f03bec-d533-450d-b79b-7f19dc436d94","Type":"ContainerStarted","Data":"425a19b5cf582ca012bbb5a8e1a477b968972c4db75ac0ca49551bb4ae484803"} Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.249736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdb76977-cn2gb" 
event={"ID":"18f03bec-d533-450d-b79b-7f19dc436d94","Type":"ContainerStarted","Data":"d1ef62b942ff38e301503c2836131d0103803379b2c2058684505eb228a90159"} Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.249900 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.252192 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f669bd74-mz8xh" event={"ID":"417726b2-75fd-4efc-84ec-803533df86aa","Type":"ContainerStarted","Data":"556527868dac8d175c6e11bca57d484310691f25ac252842802df8da12d3c8f3"} Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.252253 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f669bd74-mz8xh" event={"ID":"417726b2-75fd-4efc-84ec-803533df86aa","Type":"ContainerStarted","Data":"c3015d64e08f82a88af0129948492bc3c6ad857b57c9483bd887eeb673aa3dc3"} Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.252346 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.252674 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.269092 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-649b876b57-9jd5p" podStartSLOduration=8.269068749 podStartE2EDuration="8.269068749s" podCreationTimestamp="2025-12-09 03:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:02.264086195 +0000 UTC m=+1383.973391621" watchObservedRunningTime="2025-12-09 03:35:02.269068749 +0000 UTC m=+1383.978374175" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.289513 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-9f669bd74-mz8xh" podStartSLOduration=6.289496978 podStartE2EDuration="6.289496978s" podCreationTimestamp="2025-12-09 03:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:02.28399775 +0000 UTC m=+1383.993303176" watchObservedRunningTime="2025-12-09 03:35:02.289496978 +0000 UTC m=+1383.998802404" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.305597 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5fdb76977-cn2gb" podStartSLOduration=4.305580662 podStartE2EDuration="4.305580662s" podCreationTimestamp="2025-12-09 03:34:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:02.300255588 +0000 UTC m=+1384.009561024" watchObservedRunningTime="2025-12-09 03:35:02.305580662 +0000 UTC m=+1384.014886088" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.462420 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.526155 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-24bc9"] Dec 09 03:35:02 crc kubenswrapper[4766]: I1209 03:35:02.526424 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" podUID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" containerName="dnsmasq-dns" containerID="cri-o://1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45" gracePeriod=10 Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.053091 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.163560 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-swift-storage-0\") pod \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.163634 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-sb\") pod \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.163749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qgv\" (UniqueName: \"kubernetes.io/projected/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-kube-api-access-l7qgv\") pod \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.163773 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-svc\") pod \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.163803 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-nb\") pod \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.163892 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-config\") pod \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\" (UID: \"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb\") " Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.169722 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-kube-api-access-l7qgv" (OuterVolumeSpecName: "kube-api-access-l7qgv") pod "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" (UID: "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb"). InnerVolumeSpecName "kube-api-access-l7qgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.218873 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" (UID: "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.224976 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" (UID: "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.231302 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-config" (OuterVolumeSpecName: "config") pod "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" (UID: "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.233687 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" (UID: "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.263294 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" (UID: "fe22f23a-5dcd-4bae-9d03-e0873c6e22cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.266104 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.266132 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.266149 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.266161 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qgv\" (UniqueName: \"kubernetes.io/projected/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-kube-api-access-l7qgv\") on node 
\"crc\" DevicePath \"\"" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.266173 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.266183 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.272696 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zf6gw" event={"ID":"6d08c90c-fad3-42ae-8950-8d57a79f9654","Type":"ContainerStarted","Data":"72245693cd64770006d3527802bc4f15e68d9c06f27cb3116690cee7671021ae"} Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.285310 4766 generic.go:334] "Generic (PLEG): container finished" podID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" containerID="1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45" exitCode=0 Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.286120 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.287455 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" event={"ID":"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb","Type":"ContainerDied","Data":"1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45"} Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.287536 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-24bc9" event={"ID":"fe22f23a-5dcd-4bae-9d03-e0873c6e22cb","Type":"ContainerDied","Data":"164f27c00b0172834faa7b55e0fcc54ce5cf6abfac5656d70823f49ec7955b94"} Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.287593 4766 scope.go:117] "RemoveContainer" containerID="1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.294657 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-zf6gw" podStartSLOduration=3.442445281 podStartE2EDuration="40.294645152s" podCreationTimestamp="2025-12-09 03:34:23 +0000 UTC" firstStartedPulling="2025-12-09 03:34:25.590722917 +0000 UTC m=+1347.300028343" lastFinishedPulling="2025-12-09 03:35:02.442922788 +0000 UTC m=+1384.152228214" observedRunningTime="2025-12-09 03:35:03.292133384 +0000 UTC m=+1385.001438810" watchObservedRunningTime="2025-12-09 03:35:03.294645152 +0000 UTC m=+1385.003950578" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.325166 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-24bc9"] Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.327517 4766 scope.go:117] "RemoveContainer" containerID="52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.332074 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-24bc9"] Dec 
09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.346430 4766 scope.go:117] "RemoveContainer" containerID="1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45" Dec 09 03:35:03 crc kubenswrapper[4766]: E1209 03:35:03.346765 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45\": container with ID starting with 1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45 not found: ID does not exist" containerID="1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.346807 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45"} err="failed to get container status \"1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45\": rpc error: code = NotFound desc = could not find container \"1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45\": container with ID starting with 1259d7a2e61ed0963181e633d6d4c1eaea616c93429a9b2000343395d6283a45 not found: ID does not exist" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.346834 4766 scope.go:117] "RemoveContainer" containerID="52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1" Dec 09 03:35:03 crc kubenswrapper[4766]: E1209 03:35:03.347612 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1\": container with ID starting with 52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1 not found: ID does not exist" containerID="52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1" Dec 09 03:35:03 crc kubenswrapper[4766]: I1209 03:35:03.347645 4766 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1"} err="failed to get container status \"52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1\": rpc error: code = NotFound desc = could not find container \"52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1\": container with ID starting with 52ec18feb7dcf0d0104ce0f3372e8aaabddd22177b25fe14ecc4f516f9a917b1 not found: ID does not exist" Dec 09 03:35:04 crc kubenswrapper[4766]: I1209 03:35:04.864790 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" path="/var/lib/kubelet/pods/fe22f23a-5dcd-4bae-9d03-e0873c6e22cb/volumes" Dec 09 03:35:05 crc kubenswrapper[4766]: I1209 03:35:05.310604 4766 generic.go:334] "Generic (PLEG): container finished" podID="6d08c90c-fad3-42ae-8950-8d57a79f9654" containerID="72245693cd64770006d3527802bc4f15e68d9c06f27cb3116690cee7671021ae" exitCode=0 Dec 09 03:35:05 crc kubenswrapper[4766]: I1209 03:35:05.310665 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zf6gw" event={"ID":"6d08c90c-fad3-42ae-8950-8d57a79f9654","Type":"ContainerDied","Data":"72245693cd64770006d3527802bc4f15e68d9c06f27cb3116690cee7671021ae"} Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.071159 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.143849 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cmm8\" (UniqueName: \"kubernetes.io/projected/6d08c90c-fad3-42ae-8950-8d57a79f9654-kube-api-access-8cmm8\") pod \"6d08c90c-fad3-42ae-8950-8d57a79f9654\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.144373 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-combined-ca-bundle\") pod \"6d08c90c-fad3-42ae-8950-8d57a79f9654\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.144725 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-db-sync-config-data\") pod \"6d08c90c-fad3-42ae-8950-8d57a79f9654\" (UID: \"6d08c90c-fad3-42ae-8950-8d57a79f9654\") " Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.152224 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d08c90c-fad3-42ae-8950-8d57a79f9654" (UID: "6d08c90c-fad3-42ae-8950-8d57a79f9654"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.152700 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d08c90c-fad3-42ae-8950-8d57a79f9654-kube-api-access-8cmm8" (OuterVolumeSpecName: "kube-api-access-8cmm8") pod "6d08c90c-fad3-42ae-8950-8d57a79f9654" (UID: "6d08c90c-fad3-42ae-8950-8d57a79f9654"). 
InnerVolumeSpecName "kube-api-access-8cmm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.185163 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d08c90c-fad3-42ae-8950-8d57a79f9654" (UID: "6d08c90c-fad3-42ae-8950-8d57a79f9654"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.247053 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.247085 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cmm8\" (UniqueName: \"kubernetes.io/projected/6d08c90c-fad3-42ae-8950-8d57a79f9654-kube-api-access-8cmm8\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.247096 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d08c90c-fad3-42ae-8950-8d57a79f9654-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.343989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zf6gw" event={"ID":"6d08c90c-fad3-42ae-8950-8d57a79f9654","Type":"ContainerDied","Data":"86430353afd214cacdb01d48b5f42087d90dda8d88daa52bbeb2b100402f2c49"} Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.344033 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86430353afd214cacdb01d48b5f42087d90dda8d88daa52bbeb2b100402f2c49" Dec 09 03:35:08 crc kubenswrapper[4766]: I1209 03:35:08.344110 4766 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zf6gw" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.358916 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b4dbbdb47-q4g54"] Dec 09 03:35:09 crc kubenswrapper[4766]: E1209 03:35:09.359542 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" containerName="dnsmasq-dns" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.359559 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" containerName="dnsmasq-dns" Dec 09 03:35:09 crc kubenswrapper[4766]: E1209 03:35:09.359586 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d08c90c-fad3-42ae-8950-8d57a79f9654" containerName="barbican-db-sync" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.359592 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d08c90c-fad3-42ae-8950-8d57a79f9654" containerName="barbican-db-sync" Dec 09 03:35:09 crc kubenswrapper[4766]: E1209 03:35:09.359611 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" containerName="init" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.359617 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" containerName="init" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.359807 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe22f23a-5dcd-4bae-9d03-e0873c6e22cb" containerName="dnsmasq-dns" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.359829 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d08c90c-fad3-42ae-8950-8d57a79f9654" containerName="barbican-db-sync" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.360731 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.368746 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.372706 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.380103 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-slkfh" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.384740 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="ceilometer-central-agent" containerID="cri-o://690af035ecf71c8360e71ed1cfe8a13117268672add00219b1815b92e5272e1d" gracePeriod=30 Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.384906 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="proxy-httpd" containerID="cri-o://bcae299d720102c7510509f79675b9eb46193c5f57ef3edc5f368e610f290e4c" gracePeriod=30 Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.384963 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="sg-core" containerID="cri-o://dae354cbcd90e6ab1b02efec63ffdd4765c865493e500918a2955fe6bf02dd05" gracePeriod=30 Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.385015 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="ceilometer-notification-agent" containerID="cri-o://461930672d3ab54f27251bcecc76b609ffc06aafab7d479ace95b7d2ebe6d090" gracePeriod=30 Dec 09 
03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.385507 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerStarted","Data":"bcae299d720102c7510509f79675b9eb46193c5f57ef3edc5f368e610f290e4c"} Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.385543 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.401334 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-59f4fbb654-hrpnd"] Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.403251 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.407132 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-brzmt" event={"ID":"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc","Type":"ContainerStarted","Data":"ea2d873adb8b8f0bf3c7da1db055698c2ecd4add0b416c79010a80d08ed56eb9"} Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.431920 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b4dbbdb47-q4g54"] Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.440228 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.468277 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-59f4fbb654-hrpnd"] Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479057 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae0a18c-f118-45b5-8989-9ca3a49827ad-logs\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: 
\"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479137 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data-custom\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479254 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479306 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479344 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data-custom\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479369 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2zpw\" 
(UniqueName: \"kubernetes.io/projected/04522868-a66d-44f8-a9bb-6f157f26653f-kube-api-access-d2zpw\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479399 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04522868-a66d-44f8-a9bb-6f157f26653f-logs\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479432 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-combined-ca-bundle\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479462 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn7bj\" (UniqueName: \"kubernetes.io/projected/7ae0a18c-f118-45b5-8989-9ca3a49827ad-kube-api-access-pn7bj\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.479503 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-combined-ca-bundle\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 
03:35:09.515346 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-zxhf5"] Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.519481 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.550320 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-zxhf5"] Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585107 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn7bj\" (UniqueName: \"kubernetes.io/projected/7ae0a18c-f118-45b5-8989-9ca3a49827ad-kube-api-access-pn7bj\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585151 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-combined-ca-bundle\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585190 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae0a18c-f118-45b5-8989-9ca3a49827ad-logs\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585223 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64mf8\" (UniqueName: 
\"kubernetes.io/projected/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-kube-api-access-64mf8\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585258 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data-custom\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585298 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585329 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585348 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-config\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585373 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585393 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585413 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data-custom\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585432 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2zpw\" (UniqueName: \"kubernetes.io/projected/04522868-a66d-44f8-a9bb-6f157f26653f-kube-api-access-d2zpw\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585448 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585468 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04522868-a66d-44f8-a9bb-6f157f26653f-logs\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585485 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.585505 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-combined-ca-bundle\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.597418 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04522868-a66d-44f8-a9bb-6f157f26653f-logs\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.600175 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-combined-ca-bundle\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.600729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae0a18c-f118-45b5-8989-9ca3a49827ad-logs\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.601867 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.03961174 podStartE2EDuration="46.601853646s" podCreationTimestamp="2025-12-09 03:34:23 +0000 UTC" firstStartedPulling="2025-12-09 03:34:25.072299775 +0000 UTC m=+1346.781605201" lastFinishedPulling="2025-12-09 03:35:08.634541681 +0000 UTC m=+1390.343847107" observedRunningTime="2025-12-09 03:35:09.529721295 +0000 UTC m=+1391.239026731" watchObservedRunningTime="2025-12-09 03:35:09.601853646 +0000 UTC m=+1391.311159072" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.605761 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data-custom\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.610284 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.613648 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-brzmt" podStartSLOduration=3.240519924 podStartE2EDuration="46.613630013s" podCreationTimestamp="2025-12-09 03:34:23 +0000 UTC" 
firstStartedPulling="2025-12-09 03:34:25.239848161 +0000 UTC m=+1346.949153587" lastFinishedPulling="2025-12-09 03:35:08.61295824 +0000 UTC m=+1390.322263676" observedRunningTime="2025-12-09 03:35:09.55740691 +0000 UTC m=+1391.266712356" watchObservedRunningTime="2025-12-09 03:35:09.613630013 +0000 UTC m=+1391.322935439" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.614318 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-combined-ca-bundle\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.618148 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data-custom\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.619039 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.636520 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn7bj\" (UniqueName: \"kubernetes.io/projected/7ae0a18c-f118-45b5-8989-9ca3a49827ad-kube-api-access-pn7bj\") pod \"barbican-keystone-listener-59f4fbb654-hrpnd\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 
03:35:09.668872 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2zpw\" (UniqueName: \"kubernetes.io/projected/04522868-a66d-44f8-a9bb-6f157f26653f-kube-api-access-d2zpw\") pod \"barbican-worker-7b4dbbdb47-q4g54\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.688405 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.688461 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-config\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.688485 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.688510 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.688533 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.688583 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64mf8\" (UniqueName: \"kubernetes.io/projected/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-kube-api-access-64mf8\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.690072 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.690356 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.690793 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.691378 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-config\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.691499 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.691981 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.723853 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64mf8\" (UniqueName: \"kubernetes.io/projected/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-kube-api-access-64mf8\") pod \"dnsmasq-dns-75c8ddd69c-zxhf5\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 
03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.723948 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-584546f8-9kq4r"] Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.725824 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.728740 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.740544 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-584546f8-9kq4r"] Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.766412 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.792183 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfb02f25-f0c2-460b-99f7-8ae10815c016-logs\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.792292 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-combined-ca-bundle\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.792396 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data\") pod \"barbican-api-584546f8-9kq4r\" (UID: 
\"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.792443 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbgs\" (UniqueName: \"kubernetes.io/projected/dfb02f25-f0c2-460b-99f7-8ae10815c016-kube-api-access-tzbgs\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.792521 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data-custom\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.892622 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.909440 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfb02f25-f0c2-460b-99f7-8ae10815c016-logs\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.909513 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-combined-ca-bundle\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.909620 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.909668 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbgs\" (UniqueName: \"kubernetes.io/projected/dfb02f25-f0c2-460b-99f7-8ae10815c016-kube-api-access-tzbgs\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.909736 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data-custom\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" 
Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.911408 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfb02f25-f0c2-460b-99f7-8ae10815c016-logs\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.924241 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-combined-ca-bundle\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.925404 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.925977 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data-custom\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:09 crc kubenswrapper[4766]: I1209 03:35:09.940416 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbgs\" (UniqueName: \"kubernetes.io/projected/dfb02f25-f0c2-460b-99f7-8ae10815c016-kube-api-access-tzbgs\") pod \"barbican-api-584546f8-9kq4r\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.083950 4766 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:10 crc kubenswrapper[4766]: W1209 03:35:10.237766 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04522868_a66d_44f8_a9bb_6f157f26653f.slice/crio-b807a1c23dc3b78a2932a9d12cb33a247ec78fbd973ff078dc74c3646076ffb2 WatchSource:0}: Error finding container b807a1c23dc3b78a2932a9d12cb33a247ec78fbd973ff078dc74c3646076ffb2: Status 404 returned error can't find the container with id b807a1c23dc3b78a2932a9d12cb33a247ec78fbd973ff078dc74c3646076ffb2 Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.241637 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b4dbbdb47-q4g54"] Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.423368 4766 generic.go:334] "Generic (PLEG): container finished" podID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerID="bcae299d720102c7510509f79675b9eb46193c5f57ef3edc5f368e610f290e4c" exitCode=0 Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.423693 4766 generic.go:334] "Generic (PLEG): container finished" podID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerID="dae354cbcd90e6ab1b02efec63ffdd4765c865493e500918a2955fe6bf02dd05" exitCode=2 Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.423706 4766 generic.go:334] "Generic (PLEG): container finished" podID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerID="690af035ecf71c8360e71ed1cfe8a13117268672add00219b1815b92e5272e1d" exitCode=0 Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.423572 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerDied","Data":"bcae299d720102c7510509f79675b9eb46193c5f57ef3edc5f368e610f290e4c"} Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.423789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerDied","Data":"dae354cbcd90e6ab1b02efec63ffdd4765c865493e500918a2955fe6bf02dd05"} Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.423803 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerDied","Data":"690af035ecf71c8360e71ed1cfe8a13117268672add00219b1815b92e5272e1d"} Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.426852 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" event={"ID":"04522868-a66d-44f8-a9bb-6f157f26653f","Type":"ContainerStarted","Data":"b807a1c23dc3b78a2932a9d12cb33a247ec78fbd973ff078dc74c3646076ffb2"} Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.480712 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-59f4fbb654-hrpnd"] Dec 09 03:35:10 crc kubenswrapper[4766]: W1209 03:35:10.587858 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ed2e90b_2a3a_4a15_92f5_63ae6ebde4a8.slice/crio-03264b415a1fd2ba57e5b8fc00e4b43bf80efaa0514f575145264f1945eb003c WatchSource:0}: Error finding container 03264b415a1fd2ba57e5b8fc00e4b43bf80efaa0514f575145264f1945eb003c: Status 404 returned error can't find the container with id 03264b415a1fd2ba57e5b8fc00e4b43bf80efaa0514f575145264f1945eb003c Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.589098 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-zxhf5"] Dec 09 03:35:10 crc kubenswrapper[4766]: I1209 03:35:10.675154 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-584546f8-9kq4r"] Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.438819 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-584546f8-9kq4r" 
event={"ID":"dfb02f25-f0c2-460b-99f7-8ae10815c016","Type":"ContainerStarted","Data":"a370253f7967af61b1e3e1e4e10df5f0a1204b69e552201b60fd8483dd92832a"} Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.439091 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.439101 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-584546f8-9kq4r" event={"ID":"dfb02f25-f0c2-460b-99f7-8ae10815c016","Type":"ContainerStarted","Data":"2943e23fe2169c9438985840eb74c4ce41acda21f15bdd14f69e1c6b640843d8"} Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.439110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-584546f8-9kq4r" event={"ID":"dfb02f25-f0c2-460b-99f7-8ae10815c016","Type":"ContainerStarted","Data":"a21a6f6aa0e21dc3933b74e0a8a3a6390e83401c1477c59d9688aba482813561"} Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.439119 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.442740 4766 generic.go:334] "Generic (PLEG): container finished" podID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" containerID="1e19e8939ccf2b634db5afdb6bea4bb55aaae63d314fa98c05a6488b6ad75ea8" exitCode=0 Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.442796 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" event={"ID":"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8","Type":"ContainerDied","Data":"1e19e8939ccf2b634db5afdb6bea4bb55aaae63d314fa98c05a6488b6ad75ea8"} Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.442819 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" 
event={"ID":"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8","Type":"ContainerStarted","Data":"03264b415a1fd2ba57e5b8fc00e4b43bf80efaa0514f575145264f1945eb003c"} Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.445132 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" event={"ID":"7ae0a18c-f118-45b5-8989-9ca3a49827ad","Type":"ContainerStarted","Data":"f108bb438c724162f0547e30e4efd9a9875f3b030db705d3b3ecd584c0eb0426"} Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.450350 4766 generic.go:334] "Generic (PLEG): container finished" podID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerID="461930672d3ab54f27251bcecc76b609ffc06aafab7d479ace95b7d2ebe6d090" exitCode=0 Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.450384 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerDied","Data":"461930672d3ab54f27251bcecc76b609ffc06aafab7d479ace95b7d2ebe6d090"} Dec 09 03:35:11 crc kubenswrapper[4766]: I1209 03:35:11.455602 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-584546f8-9kq4r" podStartSLOduration=2.455588228 podStartE2EDuration="2.455588228s" podCreationTimestamp="2025-12-09 03:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:11.453784249 +0000 UTC m=+1393.163089675" watchObservedRunningTime="2025-12-09 03:35:11.455588228 +0000 UTC m=+1393.164893654" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.019809 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.165960 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-scripts\") pod \"ce20bb03-595e-4809-8bdc-77a8072c15e7\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.166849 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-run-httpd\") pod \"ce20bb03-595e-4809-8bdc-77a8072c15e7\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.166902 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp9rn\" (UniqueName: \"kubernetes.io/projected/ce20bb03-595e-4809-8bdc-77a8072c15e7-kube-api-access-pp9rn\") pod \"ce20bb03-595e-4809-8bdc-77a8072c15e7\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.167085 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ce20bb03-595e-4809-8bdc-77a8072c15e7" (UID: "ce20bb03-595e-4809-8bdc-77a8072c15e7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.167198 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-combined-ca-bundle\") pod \"ce20bb03-595e-4809-8bdc-77a8072c15e7\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.167545 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-config-data\") pod \"ce20bb03-595e-4809-8bdc-77a8072c15e7\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.167622 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-log-httpd\") pod \"ce20bb03-595e-4809-8bdc-77a8072c15e7\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.167980 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-sg-core-conf-yaml\") pod \"ce20bb03-595e-4809-8bdc-77a8072c15e7\" (UID: \"ce20bb03-595e-4809-8bdc-77a8072c15e7\") " Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.167915 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ce20bb03-595e-4809-8bdc-77a8072c15e7" (UID: "ce20bb03-595e-4809-8bdc-77a8072c15e7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.169069 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.169110 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce20bb03-595e-4809-8bdc-77a8072c15e7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.175839 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce20bb03-595e-4809-8bdc-77a8072c15e7-kube-api-access-pp9rn" (OuterVolumeSpecName: "kube-api-access-pp9rn") pod "ce20bb03-595e-4809-8bdc-77a8072c15e7" (UID: "ce20bb03-595e-4809-8bdc-77a8072c15e7"). InnerVolumeSpecName "kube-api-access-pp9rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.175967 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-scripts" (OuterVolumeSpecName: "scripts") pod "ce20bb03-595e-4809-8bdc-77a8072c15e7" (UID: "ce20bb03-595e-4809-8bdc-77a8072c15e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.201154 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ce20bb03-595e-4809-8bdc-77a8072c15e7" (UID: "ce20bb03-595e-4809-8bdc-77a8072c15e7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.270416 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp9rn\" (UniqueName: \"kubernetes.io/projected/ce20bb03-595e-4809-8bdc-77a8072c15e7-kube-api-access-pp9rn\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.270447 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.270456 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.303451 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce20bb03-595e-4809-8bdc-77a8072c15e7" (UID: "ce20bb03-595e-4809-8bdc-77a8072c15e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.318570 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b9bc9ddf8-vdj7m"] Dec 09 03:35:12 crc kubenswrapper[4766]: E1209 03:35:12.319019 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="sg-core" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.319037 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="sg-core" Dec 09 03:35:12 crc kubenswrapper[4766]: E1209 03:35:12.319064 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="ceilometer-notification-agent" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.319072 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="ceilometer-notification-agent" Dec 09 03:35:12 crc kubenswrapper[4766]: E1209 03:35:12.319115 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="proxy-httpd" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.319126 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="proxy-httpd" Dec 09 03:35:12 crc kubenswrapper[4766]: E1209 03:35:12.319142 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="ceilometer-central-agent" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.319150 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="ceilometer-central-agent" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.319358 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" 
containerName="ceilometer-central-agent" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.319373 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="ceilometer-notification-agent" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.319398 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="sg-core" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.319414 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" containerName="proxy-httpd" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.320529 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.323507 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.323678 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.333625 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-config-data" (OuterVolumeSpecName: "config-data") pod "ce20bb03-595e-4809-8bdc-77a8072c15e7" (UID: "ce20bb03-595e-4809-8bdc-77a8072c15e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.335505 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b9bc9ddf8-vdj7m"] Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.371349 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5edf46d6-e570-425b-843d-d67f5adde599-logs\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.371601 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data-custom\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.371718 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-public-tls-certs\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.371826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-combined-ca-bundle\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.371939 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ldc6d\" (UniqueName: \"kubernetes.io/projected/5edf46d6-e570-425b-843d-d67f5adde599-kube-api-access-ldc6d\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.372103 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-internal-tls-certs\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.372243 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.372524 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.372630 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce20bb03-595e-4809-8bdc-77a8072c15e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.463756 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" event={"ID":"04522868-a66d-44f8-a9bb-6f157f26653f","Type":"ContainerStarted","Data":"1f7f05c852cc98494db502851dafc9d9e8f7eb88e225048ac88ebdb63d25b528"} Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 
03:35:12.470353 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" event={"ID":"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8","Type":"ContainerStarted","Data":"1d46ce8d257b80d4a4e27654d912e2d51bed7b3e943ae0d9d0afac1cc916f758"} Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.470532 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.474513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce20bb03-595e-4809-8bdc-77a8072c15e7","Type":"ContainerDied","Data":"819aa22194b46f4e5d5a170c7b8e78bae8b5f02bb4dbcc9dc57e7617b0c0081a"} Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.474535 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.474564 4766 scope.go:117] "RemoveContainer" containerID="bcae299d720102c7510509f79675b9eb46193c5f57ef3edc5f368e610f290e4c" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.474789 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5edf46d6-e570-425b-843d-d67f5adde599-logs\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.474828 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data-custom\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.474896 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-public-tls-certs\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.474926 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-combined-ca-bundle\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.474988 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldc6d\" (UniqueName: \"kubernetes.io/projected/5edf46d6-e570-425b-843d-d67f5adde599-kube-api-access-ldc6d\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.475073 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-internal-tls-certs\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.475100 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.475373 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5edf46d6-e570-425b-843d-d67f5adde599-logs\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.479590 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-internal-tls-certs\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.480191 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-combined-ca-bundle\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.481228 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data-custom\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.481870 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.482528 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-public-tls-certs\") pod 
\"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.492158 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" podStartSLOduration=3.492138986 podStartE2EDuration="3.492138986s" podCreationTimestamp="2025-12-09 03:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:12.485632541 +0000 UTC m=+1394.194937967" watchObservedRunningTime="2025-12-09 03:35:12.492138986 +0000 UTC m=+1394.201444412" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.495089 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldc6d\" (UniqueName: \"kubernetes.io/projected/5edf46d6-e570-425b-843d-d67f5adde599-kube-api-access-ldc6d\") pod \"barbican-api-7b9bc9ddf8-vdj7m\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.527765 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.572647 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.595861 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.598129 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.601159 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.602488 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.607387 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.680150 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.680249 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-scripts\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.680289 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-config-data\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.680326 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9bp5\" (UniqueName: \"kubernetes.io/projected/1fb3049a-4c91-4695-bf48-303308399ccc-kube-api-access-q9bp5\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " 
pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.680436 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.680496 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.680527 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.714865 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.782147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.782373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-scripts\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.782417 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-config-data\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.782454 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9bp5\" (UniqueName: \"kubernetes.io/projected/1fb3049a-4c91-4695-bf48-303308399ccc-kube-api-access-q9bp5\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.782525 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.782571 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.782599 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.783096 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.783186 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.786945 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.787493 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-scripts\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.787843 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.790131 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-config-data\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.800841 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9bp5\" (UniqueName: \"kubernetes.io/projected/1fb3049a-4c91-4695-bf48-303308399ccc-kube-api-access-q9bp5\") pod \"ceilometer-0\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " pod="openstack/ceilometer-0" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.851277 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce20bb03-595e-4809-8bdc-77a8072c15e7" path="/var/lib/kubelet/pods/ce20bb03-595e-4809-8bdc-77a8072c15e7/volumes" Dec 09 03:35:12 crc kubenswrapper[4766]: I1209 03:35:12.923982 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:35:13 crc kubenswrapper[4766]: I1209 03:35:13.045341 4766 scope.go:117] "RemoveContainer" containerID="dae354cbcd90e6ab1b02efec63ffdd4765c865493e500918a2955fe6bf02dd05" Dec 09 03:35:13 crc kubenswrapper[4766]: I1209 03:35:13.120713 4766 scope.go:117] "RemoveContainer" containerID="461930672d3ab54f27251bcecc76b609ffc06aafab7d479ace95b7d2ebe6d090" Dec 09 03:35:13 crc kubenswrapper[4766]: I1209 03:35:13.149016 4766 scope.go:117] "RemoveContainer" containerID="690af035ecf71c8360e71ed1cfe8a13117268672add00219b1815b92e5272e1d" Dec 09 03:35:13 crc kubenswrapper[4766]: I1209 03:35:13.483565 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" event={"ID":"7ae0a18c-f118-45b5-8989-9ca3a49827ad","Type":"ContainerStarted","Data":"0064b87b123267ac3d50f8f1784bd6b3c893066418c3670a19b10112ba20b3b9"} Dec 09 03:35:13 crc kubenswrapper[4766]: I1209 03:35:13.487923 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" event={"ID":"04522868-a66d-44f8-a9bb-6f157f26653f","Type":"ContainerStarted","Data":"6344e7369a853e77c3393dd6d7339e8c504e823cd554068d45409f77989aba7d"} Dec 09 03:35:13 crc kubenswrapper[4766]: I1209 03:35:13.507459 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" podStartSLOduration=2.854438073 podStartE2EDuration="4.507440823s" podCreationTimestamp="2025-12-09 03:35:09 +0000 UTC" firstStartedPulling="2025-12-09 03:35:10.241965754 +0000 UTC m=+1391.951271180" lastFinishedPulling="2025-12-09 03:35:11.894968504 +0000 UTC m=+1393.604273930" observedRunningTime="2025-12-09 03:35:13.50401703 +0000 UTC m=+1395.213322456" watchObservedRunningTime="2025-12-09 03:35:13.507440823 +0000 UTC m=+1395.216746249" Dec 09 03:35:13 crc kubenswrapper[4766]: I1209 03:35:13.587969 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Dec 09 03:35:13 crc kubenswrapper[4766]: I1209 03:35:13.660093 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b9bc9ddf8-vdj7m"] Dec 09 03:35:13 crc kubenswrapper[4766]: W1209 03:35:13.660397 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5edf46d6_e570_425b_843d_d67f5adde599.slice/crio-44e73d47bd86214788c3cb214c9d832d8de34b3ea0cbb48ee72f233434d5cc4e WatchSource:0}: Error finding container 44e73d47bd86214788c3cb214c9d832d8de34b3ea0cbb48ee72f233434d5cc4e: Status 404 returned error can't find the container with id 44e73d47bd86214788c3cb214c9d832d8de34b3ea0cbb48ee72f233434d5cc4e Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.501160 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerStarted","Data":"ae00c50d3de713a5eb8356e5e5eff40b598fe44ecb96003e5577b4af3098f322"} Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.501200 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerStarted","Data":"dbb19d82afc22e867c96efa180cf9e791f9c0deab84af51f05e1906492a9caae"} Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.503375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" event={"ID":"5edf46d6-e570-425b-843d-d67f5adde599","Type":"ContainerStarted","Data":"0e789a722d40821da0b1865e918a8aa00c5015bf08b60022e5a34cfa2cf5a712"} Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.503398 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" event={"ID":"5edf46d6-e570-425b-843d-d67f5adde599","Type":"ContainerStarted","Data":"7f712ce3fa09ea2b5ba75d7f96299ac239eca89fd085f13671911969bcb7a465"} Dec 09 03:35:14 crc 
kubenswrapper[4766]: I1209 03:35:14.503409 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" event={"ID":"5edf46d6-e570-425b-843d-d67f5adde599","Type":"ContainerStarted","Data":"44e73d47bd86214788c3cb214c9d832d8de34b3ea0cbb48ee72f233434d5cc4e"} Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.504142 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.504187 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.506556 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" event={"ID":"7ae0a18c-f118-45b5-8989-9ca3a49827ad","Type":"ContainerStarted","Data":"6d7c92439814de4f38c3b7f907335c896ebf15dd2cc3e1bf15669b628a996f74"} Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.531111 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" podStartSLOduration=2.531088694 podStartE2EDuration="2.531088694s" podCreationTimestamp="2025-12-09 03:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:14.525416461 +0000 UTC m=+1396.234721887" watchObservedRunningTime="2025-12-09 03:35:14.531088694 +0000 UTC m=+1396.240394120" Dec 09 03:35:14 crc kubenswrapper[4766]: I1209 03:35:14.558337 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" podStartSLOduration=2.921401927 podStartE2EDuration="5.558312067s" podCreationTimestamp="2025-12-09 03:35:09 +0000 UTC" firstStartedPulling="2025-12-09 03:35:10.496439664 +0000 UTC m=+1392.205745090" lastFinishedPulling="2025-12-09 03:35:13.133349814 
+0000 UTC m=+1394.842655230" observedRunningTime="2025-12-09 03:35:14.543897669 +0000 UTC m=+1396.253203105" watchObservedRunningTime="2025-12-09 03:35:14.558312067 +0000 UTC m=+1396.267617493" Dec 09 03:35:15 crc kubenswrapper[4766]: I1209 03:35:15.520452 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerStarted","Data":"eb8a622a7ab97514c618cf772bac6f12bb0184f2ca3e77b1b88175ba51327944"} Dec 09 03:35:15 crc kubenswrapper[4766]: I1209 03:35:15.524321 4766 generic.go:334] "Generic (PLEG): container finished" podID="4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" containerID="ea2d873adb8b8f0bf3c7da1db055698c2ecd4add0b416c79010a80d08ed56eb9" exitCode=0 Dec 09 03:35:15 crc kubenswrapper[4766]: I1209 03:35:15.524402 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-brzmt" event={"ID":"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc","Type":"ContainerDied","Data":"ea2d873adb8b8f0bf3c7da1db055698c2ecd4add0b416c79010a80d08ed56eb9"} Dec 09 03:35:16 crc kubenswrapper[4766]: I1209 03:35:16.535046 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerStarted","Data":"7dec08f04cb360b6f65de06cb26d35c89057423b01725a696a95038fbe46625c"} Dec 09 03:35:16 crc kubenswrapper[4766]: I1209 03:35:16.735748 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:16 crc kubenswrapper[4766]: I1209 03:35:16.927389 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-brzmt" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.062715 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-scripts\") pod \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.063152 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-config-data\") pod \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.063232 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-etc-machine-id\") pod \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.063286 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf6bf\" (UniqueName: \"kubernetes.io/projected/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-kube-api-access-vf6bf\") pod \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.063317 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-combined-ca-bundle\") pod \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.063393 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-db-sync-config-data\") pod \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\" (UID: \"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc\") " Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.064016 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" (UID: "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.068788 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" (UID: "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.069011 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-kube-api-access-vf6bf" (OuterVolumeSpecName: "kube-api-access-vf6bf") pod "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" (UID: "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc"). InnerVolumeSpecName "kube-api-access-vf6bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.071381 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-scripts" (OuterVolumeSpecName: "scripts") pod "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" (UID: "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.091723 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" (UID: "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.111622 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-config-data" (OuterVolumeSpecName: "config-data") pod "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" (UID: "4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.167664 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.167744 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.167786 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.167797 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf6bf\" (UniqueName: \"kubernetes.io/projected/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-kube-api-access-vf6bf\") on node \"crc\" DevicePath \"\"" Dec 09 
03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.167806 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.167814 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.545706 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerStarted","Data":"cfdd4a4572d158e42a537050a3198fb1b562beaa7b955d01d2bf967feb6d518e"} Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.546505 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.550122 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-brzmt" event={"ID":"4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc","Type":"ContainerDied","Data":"4be0abc05661091c0a340bbcdbbed57455a690b1f18ad003d19eab3fad2fb281"} Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.550151 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4be0abc05661091c0a340bbcdbbed57455a690b1f18ad003d19eab3fad2fb281" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.550258 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-brzmt" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.598618 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.406807419 podStartE2EDuration="5.598598554s" podCreationTimestamp="2025-12-09 03:35:12 +0000 UTC" firstStartedPulling="2025-12-09 03:35:13.585180725 +0000 UTC m=+1395.294486141" lastFinishedPulling="2025-12-09 03:35:16.77697185 +0000 UTC m=+1398.486277276" observedRunningTime="2025-12-09 03:35:17.575710108 +0000 UTC m=+1399.285015544" watchObservedRunningTime="2025-12-09 03:35:17.598598554 +0000 UTC m=+1399.307903980" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.847812 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:17 crc kubenswrapper[4766]: E1209 03:35:17.848325 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" containerName="cinder-db-sync" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.848352 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" containerName="cinder-db-sync" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.848612 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" containerName="cinder-db-sync" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.849883 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.851932 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.852189 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2vlnm" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.852422 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.852556 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.856107 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.956138 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-zxhf5"] Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.956377 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" podUID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" containerName="dnsmasq-dns" containerID="cri-o://1d46ce8d257b80d4a4e27654d912e2d51bed7b3e943ae0d9d0afac1cc916f758" gracePeriod=10 Dec 09 03:35:17 crc kubenswrapper[4766]: I1209 03:35:17.959276 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.005585 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc 
kubenswrapper[4766]: I1209 03:35:18.005894 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkgl2\" (UniqueName: \"kubernetes.io/projected/976f83ea-558d-45e2-afec-42a12ae6c8ec-kube-api-access-hkgl2\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.006630 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/976f83ea-558d-45e2-afec-42a12ae6c8ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.006765 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.006796 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.006864 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.018969 4766 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ns6dc"] Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.021257 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.035813 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ns6dc"] Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.063954 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.065371 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.071532 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.105293 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110459 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110528 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcfnb\" (UniqueName: \"kubernetes.io/projected/5ee1455b-319a-4093-85ba-0b97e662ecf8-kube-api-access-xcfnb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110553 4766 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110607 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110632 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110652 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkgl2\" (UniqueName: \"kubernetes.io/projected/976f83ea-558d-45e2-afec-42a12ae6c8ec-kube-api-access-hkgl2\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110673 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110692 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-config\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110720 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/976f83ea-558d-45e2-afec-42a12ae6c8ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110734 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110760 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-svc\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.110805 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.114308 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/976f83ea-558d-45e2-afec-42a12ae6c8ec-etc-machine-id\") 
pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.120109 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.120450 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.123449 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.124834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.137595 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkgl2\" (UniqueName: \"kubernetes.io/projected/976f83ea-558d-45e2-afec-42a12ae6c8ec-kube-api-access-hkgl2\") pod \"cinder-scheduler-0\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.201810 4766 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212131 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/632db043-21e1-4275-9956-53620e994ced-logs\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212176 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjgmj\" (UniqueName: \"kubernetes.io/projected/632db043-21e1-4275-9956-53620e994ced-kube-api-access-hjgmj\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212292 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212349 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212375 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-config\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" 
Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212416 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212440 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212465 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/632db043-21e1-4275-9956-53620e994ced-etc-machine-id\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212495 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-scripts\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212521 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-svc\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212587 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212662 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data-custom\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.212695 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcfnb\" (UniqueName: \"kubernetes.io/projected/5ee1455b-319a-4093-85ba-0b97e662ecf8-kube-api-access-xcfnb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.213578 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.213911 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.214251 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-config\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.214539 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.216858 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-svc\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.239705 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcfnb\" (UniqueName: \"kubernetes.io/projected/5ee1455b-319a-4093-85ba-0b97e662ecf8-kube-api-access-xcfnb\") pod \"dnsmasq-dns-5784cf869f-ns6dc\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.305682 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.313730 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data-custom\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.313798 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/632db043-21e1-4275-9956-53620e994ced-logs\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.313817 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjgmj\" (UniqueName: \"kubernetes.io/projected/632db043-21e1-4275-9956-53620e994ced-kube-api-access-hjgmj\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.313882 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.313898 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/632db043-21e1-4275-9956-53620e994ced-etc-machine-id\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.313917 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-scripts\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.313961 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " 
pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.316306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/632db043-21e1-4275-9956-53620e994ced-etc-machine-id\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.316597 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/632db043-21e1-4275-9956-53620e994ced-logs\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.317478 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data-custom\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.319299 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.319529 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-scripts\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.321474 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.335240 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjgmj\" (UniqueName: \"kubernetes.io/projected/632db043-21e1-4275-9956-53620e994ced-kube-api-access-hjgmj\") pod \"cinder-api-0\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.365028 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.519139 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.565301 4766 generic.go:334] "Generic (PLEG): container finished" podID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" containerID="1d46ce8d257b80d4a4e27654d912e2d51bed7b3e943ae0d9d0afac1cc916f758" exitCode=0 Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.565408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" event={"ID":"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8","Type":"ContainerDied","Data":"1d46ce8d257b80d4a4e27654d912e2d51bed7b3e943ae0d9d0afac1cc916f758"} Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.565459 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" event={"ID":"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8","Type":"ContainerDied","Data":"03264b415a1fd2ba57e5b8fc00e4b43bf80efaa0514f575145264f1945eb003c"} Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.565496 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03264b415a1fd2ba57e5b8fc00e4b43bf80efaa0514f575145264f1945eb003c" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.578910 4766 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.722543 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64mf8\" (UniqueName: \"kubernetes.io/projected/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-kube-api-access-64mf8\") pod \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.722632 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-svc\") pod \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.722681 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-swift-storage-0\") pod \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.722699 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-config\") pod \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.722811 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-nb\") pod \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.722868 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-sb\") pod \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\" (UID: \"9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8\") " Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.729177 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-kube-api-access-64mf8" (OuterVolumeSpecName: "kube-api-access-64mf8") pod "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" (UID: "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8"). InnerVolumeSpecName "kube-api-access-64mf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.730266 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64mf8\" (UniqueName: \"kubernetes.io/projected/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-kube-api-access-64mf8\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.785151 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-config" (OuterVolumeSpecName: "config") pod "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" (UID: "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.795068 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" (UID: "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.800528 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" (UID: "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.821247 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" (UID: "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.831676 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.831755 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.831773 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.831789 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-ovsdbserver-sb\") on node 
\"crc\" DevicePath \"\"" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.832961 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" (UID: "9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.879841 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.932952 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:18 crc kubenswrapper[4766]: I1209 03:35:18.994451 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ns6dc"] Dec 09 03:35:19 crc kubenswrapper[4766]: W1209 03:35:19.006351 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ee1455b_319a_4093_85ba_0b97e662ecf8.slice/crio-dfe098adacb25ec4d068534d1058cf434692b493fba842b9e9c9d55845e39a8c WatchSource:0}: Error finding container dfe098adacb25ec4d068534d1058cf434692b493fba842b9e9c9d55845e39a8c: Status 404 returned error can't find the container with id dfe098adacb25ec4d068534d1058cf434692b493fba842b9e9c9d55845e39a8c Dec 09 03:35:19 crc kubenswrapper[4766]: I1209 03:35:19.143590 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:19 crc kubenswrapper[4766]: I1209 03:35:19.588522 4766 generic.go:334] "Generic (PLEG): container finished" podID="5ee1455b-319a-4093-85ba-0b97e662ecf8" containerID="5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7" exitCode=0 Dec 09 03:35:19 crc 
kubenswrapper[4766]: I1209 03:35:19.588598 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" event={"ID":"5ee1455b-319a-4093-85ba-0b97e662ecf8","Type":"ContainerDied","Data":"5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7"} Dec 09 03:35:19 crc kubenswrapper[4766]: I1209 03:35:19.588628 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" event={"ID":"5ee1455b-319a-4093-85ba-0b97e662ecf8","Type":"ContainerStarted","Data":"dfe098adacb25ec4d068534d1058cf434692b493fba842b9e9c9d55845e39a8c"} Dec 09 03:35:19 crc kubenswrapper[4766]: I1209 03:35:19.592296 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"976f83ea-558d-45e2-afec-42a12ae6c8ec","Type":"ContainerStarted","Data":"f25abdee22f12253f223570ce2d19788f97b6a76cb798f3306ce97520e495909"} Dec 09 03:35:19 crc kubenswrapper[4766]: I1209 03:35:19.594028 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-zxhf5" Dec 09 03:35:19 crc kubenswrapper[4766]: I1209 03:35:19.594190 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"632db043-21e1-4275-9956-53620e994ced","Type":"ContainerStarted","Data":"ce074cd62369c813b13de1f45317b0ed2b539e2dc585b2b9086c30f6acd85976"} Dec 09 03:35:19 crc kubenswrapper[4766]: I1209 03:35:19.633045 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-zxhf5"] Dec 09 03:35:19 crc kubenswrapper[4766]: I1209 03:35:19.643165 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-zxhf5"] Dec 09 03:35:20 crc kubenswrapper[4766]: I1209 03:35:20.540011 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:20 crc kubenswrapper[4766]: I1209 03:35:20.628954 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"632db043-21e1-4275-9956-53620e994ced","Type":"ContainerStarted","Data":"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2"} Dec 09 03:35:20 crc kubenswrapper[4766]: I1209 03:35:20.643989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" event={"ID":"5ee1455b-319a-4093-85ba-0b97e662ecf8","Type":"ContainerStarted","Data":"37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f"} Dec 09 03:35:20 crc kubenswrapper[4766]: I1209 03:35:20.645381 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:20 crc kubenswrapper[4766]: I1209 03:35:20.665606 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"976f83ea-558d-45e2-afec-42a12ae6c8ec","Type":"ContainerStarted","Data":"5890cc79cfa4669e7500bf99a1a8eae4ff0f229f2f447b5f5e7132a76de6180b"} Dec 09 03:35:20 crc kubenswrapper[4766]: I1209 
03:35:20.721787 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" podStartSLOduration=3.721759602 podStartE2EDuration="3.721759602s" podCreationTimestamp="2025-12-09 03:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:20.700194071 +0000 UTC m=+1402.409499497" watchObservedRunningTime="2025-12-09 03:35:20.721759602 +0000 UTC m=+1402.431065028" Dec 09 03:35:20 crc kubenswrapper[4766]: I1209 03:35:20.864294 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" path="/var/lib/kubelet/pods/9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8/volumes" Dec 09 03:35:21 crc kubenswrapper[4766]: I1209 03:35:21.686639 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"976f83ea-558d-45e2-afec-42a12ae6c8ec","Type":"ContainerStarted","Data":"21382bc634c266a6b0511875b9d323af3e34f39ab8c2a97bce3d7eea290f9e9d"} Dec 09 03:35:21 crc kubenswrapper[4766]: I1209 03:35:21.691621 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"632db043-21e1-4275-9956-53620e994ced","Type":"ContainerStarted","Data":"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939"} Dec 09 03:35:21 crc kubenswrapper[4766]: I1209 03:35:21.691621 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="632db043-21e1-4275-9956-53620e994ced" containerName="cinder-api-log" containerID="cri-o://81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2" gracePeriod=30 Dec 09 03:35:21 crc kubenswrapper[4766]: I1209 03:35:21.691738 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 03:35:21 crc kubenswrapper[4766]: I1209 03:35:21.691775 4766 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="632db043-21e1-4275-9956-53620e994ced" containerName="cinder-api" containerID="cri-o://120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939" gracePeriod=30 Dec 09 03:35:21 crc kubenswrapper[4766]: I1209 03:35:21.710747 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.893213456 podStartE2EDuration="4.710727659s" podCreationTimestamp="2025-12-09 03:35:17 +0000 UTC" firstStartedPulling="2025-12-09 03:35:18.830461199 +0000 UTC m=+1400.539766625" lastFinishedPulling="2025-12-09 03:35:19.647975402 +0000 UTC m=+1401.357280828" observedRunningTime="2025-12-09 03:35:21.709892037 +0000 UTC m=+1403.419197463" watchObservedRunningTime="2025-12-09 03:35:21.710727659 +0000 UTC m=+1403.420033085" Dec 09 03:35:21 crc kubenswrapper[4766]: I1209 03:35:21.737333 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.737297694 podStartE2EDuration="3.737297694s" podCreationTimestamp="2025-12-09 03:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:21.72783976 +0000 UTC m=+1403.437145196" watchObservedRunningTime="2025-12-09 03:35:21.737297694 +0000 UTC m=+1403.446603120" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.345045 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.431537 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-scripts\") pod \"632db043-21e1-4275-9956-53620e994ced\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.431668 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjgmj\" (UniqueName: \"kubernetes.io/projected/632db043-21e1-4275-9956-53620e994ced-kube-api-access-hjgmj\") pod \"632db043-21e1-4275-9956-53620e994ced\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.431745 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-combined-ca-bundle\") pod \"632db043-21e1-4275-9956-53620e994ced\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.431879 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/632db043-21e1-4275-9956-53620e994ced-etc-machine-id\") pod \"632db043-21e1-4275-9956-53620e994ced\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.431920 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data-custom\") pod \"632db043-21e1-4275-9956-53620e994ced\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.431964 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data\") pod \"632db043-21e1-4275-9956-53620e994ced\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.431995 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/632db043-21e1-4275-9956-53620e994ced-logs\") pod \"632db043-21e1-4275-9956-53620e994ced\" (UID: \"632db043-21e1-4275-9956-53620e994ced\") " Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.432400 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/632db043-21e1-4275-9956-53620e994ced-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "632db043-21e1-4275-9956-53620e994ced" (UID: "632db043-21e1-4275-9956-53620e994ced"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.433133 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632db043-21e1-4275-9956-53620e994ced-logs" (OuterVolumeSpecName: "logs") pod "632db043-21e1-4275-9956-53620e994ced" (UID: "632db043-21e1-4275-9956-53620e994ced"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.438351 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632db043-21e1-4275-9956-53620e994ced-kube-api-access-hjgmj" (OuterVolumeSpecName: "kube-api-access-hjgmj") pod "632db043-21e1-4275-9956-53620e994ced" (UID: "632db043-21e1-4275-9956-53620e994ced"). InnerVolumeSpecName "kube-api-access-hjgmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.438774 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "632db043-21e1-4275-9956-53620e994ced" (UID: "632db043-21e1-4275-9956-53620e994ced"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.439446 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-scripts" (OuterVolumeSpecName: "scripts") pod "632db043-21e1-4275-9956-53620e994ced" (UID: "632db043-21e1-4275-9956-53620e994ced"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.479955 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "632db043-21e1-4275-9956-53620e994ced" (UID: "632db043-21e1-4275-9956-53620e994ced"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.509074 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data" (OuterVolumeSpecName: "config-data") pod "632db043-21e1-4275-9956-53620e994ced" (UID: "632db043-21e1-4275-9956-53620e994ced"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.534485 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjgmj\" (UniqueName: \"kubernetes.io/projected/632db043-21e1-4275-9956-53620e994ced-kube-api-access-hjgmj\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.534515 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.534525 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/632db043-21e1-4275-9956-53620e994ced-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.534533 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.534542 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.534550 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/632db043-21e1-4275-9956-53620e994ced-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.534558 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/632db043-21e1-4275-9956-53620e994ced-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.627113 4766 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.701757 4766 generic.go:334] "Generic (PLEG): container finished" podID="632db043-21e1-4275-9956-53620e994ced" containerID="120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939" exitCode=0 Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.702944 4766 generic.go:334] "Generic (PLEG): container finished" podID="632db043-21e1-4275-9956-53620e994ced" containerID="81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2" exitCode=143 Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.701839 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"632db043-21e1-4275-9956-53620e994ced","Type":"ContainerDied","Data":"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939"} Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.703378 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"632db043-21e1-4275-9956-53620e994ced","Type":"ContainerDied","Data":"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2"} Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.703405 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"632db043-21e1-4275-9956-53620e994ced","Type":"ContainerDied","Data":"ce074cd62369c813b13de1f45317b0ed2b539e2dc585b2b9086c30f6acd85976"} Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.703424 4766 scope.go:117] "RemoveContainer" containerID="120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.701805 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.755344 4766 scope.go:117] "RemoveContainer" containerID="81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.756191 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.797187 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.808281 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:22 crc kubenswrapper[4766]: E1209 03:35:22.808708 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632db043-21e1-4275-9956-53620e994ced" containerName="cinder-api-log" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.808727 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="632db043-21e1-4275-9956-53620e994ced" containerName="cinder-api-log" Dec 09 03:35:22 crc kubenswrapper[4766]: E1209 03:35:22.808737 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632db043-21e1-4275-9956-53620e994ced" containerName="cinder-api" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.808744 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="632db043-21e1-4275-9956-53620e994ced" containerName="cinder-api" Dec 09 03:35:22 crc kubenswrapper[4766]: E1209 03:35:22.808788 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" containerName="init" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.808795 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" containerName="init" Dec 09 03:35:22 crc kubenswrapper[4766]: E1209 03:35:22.808804 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" containerName="dnsmasq-dns" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.808811 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" containerName="dnsmasq-dns" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.808974 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="632db043-21e1-4275-9956-53620e994ced" containerName="cinder-api-log" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.808995 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="632db043-21e1-4275-9956-53620e994ced" containerName="cinder-api" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.809008 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed2e90b-2a3a-4a15-92f5-63ae6ebde4a8" containerName="dnsmasq-dns" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.810191 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.817481 4766 scope.go:117] "RemoveContainer" containerID="120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.817884 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.818075 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.818277 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.819436 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:22 crc kubenswrapper[4766]: E1209 03:35:22.819623 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939\": container with ID starting with 120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939 not found: ID does not exist" containerID="120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.819673 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939"} err="failed to get container status \"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939\": rpc error: code = NotFound desc = could not find container \"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939\": container with ID starting with 120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939 not found: ID does not exist" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.819702 4766 scope.go:117] "RemoveContainer" containerID="81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2" Dec 09 03:35:22 crc kubenswrapper[4766]: E1209 03:35:22.824367 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2\": container with ID starting with 81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2 not found: ID does not exist" containerID="81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.824414 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2"} err="failed to get container status \"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2\": rpc error: code = NotFound desc = could not find container 
\"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2\": container with ID starting with 81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2 not found: ID does not exist" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.824441 4766 scope.go:117] "RemoveContainer" containerID="120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.830289 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939"} err="failed to get container status \"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939\": rpc error: code = NotFound desc = could not find container \"120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939\": container with ID starting with 120361cc5d61350eb8dc8d7800b077e70554a1b52803f5412e3936616266e939 not found: ID does not exist" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.830326 4766 scope.go:117] "RemoveContainer" containerID="81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.832355 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2"} err="failed to get container status \"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2\": rpc error: code = NotFound desc = could not find container \"81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2\": container with ID starting with 81d4473bc42a8a9b35a683310d9b77f252cdb8a86b59eca00f9b0b31682dc2e2 not found: ID does not exist" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.861699 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632db043-21e1-4275-9956-53620e994ced" path="/var/lib/kubelet/pods/632db043-21e1-4275-9956-53620e994ced/volumes" Dec 09 
03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.957965 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.958047 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-scripts\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.958107 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47770faa-9973-4d81-a630-8c344bcd7b94-logs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.958157 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data-custom\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.958196 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47770faa-9973-4d81-a630-8c344bcd7b94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.958249 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.958325 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.958447 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:22 crc kubenswrapper[4766]: I1209 03:35:22.958553 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxfn\" (UniqueName: \"kubernetes.io/projected/47770faa-9973-4d81-a630-8c344bcd7b94-kube-api-access-mhxfn\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060508 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060575 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-scripts\") pod \"cinder-api-0\" (UID: 
\"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060623 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47770faa-9973-4d81-a630-8c344bcd7b94-logs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060670 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data-custom\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060710 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47770faa-9973-4d81-a630-8c344bcd7b94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060741 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060783 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060818 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.060852 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxfn\" (UniqueName: \"kubernetes.io/projected/47770faa-9973-4d81-a630-8c344bcd7b94-kube-api-access-mhxfn\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.061196 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47770faa-9973-4d81-a630-8c344bcd7b94-logs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.061602 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47770faa-9973-4d81-a630-8c344bcd7b94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.064984 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.065431 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-scripts\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 
03:35:23.065692 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data-custom\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.066742 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.066750 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.069309 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.082950 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxfn\" (UniqueName: \"kubernetes.io/projected/47770faa-9973-4d81-a630-8c344bcd7b94-kube-api-access-mhxfn\") pod \"cinder-api-0\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.173550 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.203125 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.638877 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:35:23 crc kubenswrapper[4766]: I1209 03:35:23.717825 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47770faa-9973-4d81-a630-8c344bcd7b94","Type":"ContainerStarted","Data":"21f2a02a7d923e4ce0cc1cbd8be6525d1a84e7484dd0587a2325c4d72ba7ab68"} Dec 09 03:35:24 crc kubenswrapper[4766]: I1209 03:35:24.352848 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:24 crc kubenswrapper[4766]: I1209 03:35:24.355221 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:35:24 crc kubenswrapper[4766]: I1209 03:35:24.461567 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-584546f8-9kq4r"] Dec 09 03:35:24 crc kubenswrapper[4766]: I1209 03:35:24.461848 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-584546f8-9kq4r" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api-log" containerID="cri-o://2943e23fe2169c9438985840eb74c4ce41acda21f15bdd14f69e1c6b640843d8" gracePeriod=30 Dec 09 03:35:24 crc kubenswrapper[4766]: I1209 03:35:24.462401 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-584546f8-9kq4r" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api" containerID="cri-o://a370253f7967af61b1e3e1e4e10df5f0a1204b69e552201b60fd8483dd92832a" gracePeriod=30 Dec 09 03:35:24 crc kubenswrapper[4766]: I1209 03:35:24.730380 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47770faa-9973-4d81-a630-8c344bcd7b94","Type":"ContainerStarted","Data":"fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67"} Dec 09 03:35:24 crc kubenswrapper[4766]: I1209 03:35:24.732093 4766 generic.go:334] "Generic (PLEG): container finished" podID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerID="2943e23fe2169c9438985840eb74c4ce41acda21f15bdd14f69e1c6b640843d8" exitCode=143 Dec 09 03:35:24 crc kubenswrapper[4766]: I1209 03:35:24.732921 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-584546f8-9kq4r" event={"ID":"dfb02f25-f0c2-460b-99f7-8ae10815c016","Type":"ContainerDied","Data":"2943e23fe2169c9438985840eb74c4ce41acda21f15bdd14f69e1c6b640843d8"} Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.164122 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.218745 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c7f64cdcb-c5qmq"] Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.219165 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c7f64cdcb-c5qmq" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerName="neutron-api" containerID="cri-o://79606849fff098e5f5d81ec4cb333c19fbcf0832408dee0e4a2616cb5d7c5c81" gracePeriod=30 Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.219633 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c7f64cdcb-c5qmq" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerName="neutron-httpd" containerID="cri-o://e166ebc5a4242345096ced25e89f65530d35ce71818720197b6d41e577df2045" gracePeriod=30 Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.742918 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerID="e166ebc5a4242345096ced25e89f65530d35ce71818720197b6d41e577df2045" exitCode=0 Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.743288 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c7f64cdcb-c5qmq" event={"ID":"852701bf-d4f9-4b3f-a2a0-76253feafe4f","Type":"ContainerDied","Data":"e166ebc5a4242345096ced25e89f65530d35ce71818720197b6d41e577df2045"} Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.745866 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47770faa-9973-4d81-a630-8c344bcd7b94","Type":"ContainerStarted","Data":"03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970"} Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.747134 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 03:35:25 crc kubenswrapper[4766]: I1209 03:35:25.767083 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.767060343 podStartE2EDuration="3.767060343s" podCreationTimestamp="2025-12-09 03:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:25.766982831 +0000 UTC m=+1407.476288257" watchObservedRunningTime="2025-12-09 03:35:25.767060343 +0000 UTC m=+1407.476365769" Dec 09 03:35:27 crc kubenswrapper[4766]: I1209 03:35:27.618698 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-584546f8-9kq4r" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:45168->10.217.0.154:9311: read: connection reset by peer" Dec 09 03:35:27 crc kubenswrapper[4766]: I1209 03:35:27.618699 4766 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-584546f8-9kq4r" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:45156->10.217.0.154:9311: read: connection reset by peer" Dec 09 03:35:27 crc kubenswrapper[4766]: I1209 03:35:27.778154 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:35:27 crc kubenswrapper[4766]: I1209 03:35:27.807028 4766 generic.go:334] "Generic (PLEG): container finished" podID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerID="a370253f7967af61b1e3e1e4e10df5f0a1204b69e552201b60fd8483dd92832a" exitCode=0 Dec 09 03:35:27 crc kubenswrapper[4766]: I1209 03:35:27.808356 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-584546f8-9kq4r" event={"ID":"dfb02f25-f0c2-460b-99f7-8ae10815c016","Type":"ContainerDied","Data":"a370253f7967af61b1e3e1e4e10df5f0a1204b69e552201b60fd8483dd92832a"} Dec 09 03:35:27 crc kubenswrapper[4766]: I1209 03:35:27.814749 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.121856 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.302823 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data\") pod \"dfb02f25-f0c2-460b-99f7-8ae10815c016\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.303241 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfb02f25-f0c2-460b-99f7-8ae10815c016-logs\") pod \"dfb02f25-f0c2-460b-99f7-8ae10815c016\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.303504 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-combined-ca-bundle\") pod \"dfb02f25-f0c2-460b-99f7-8ae10815c016\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.303619 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data-custom\") pod \"dfb02f25-f0c2-460b-99f7-8ae10815c016\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.303665 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzbgs\" (UniqueName: \"kubernetes.io/projected/dfb02f25-f0c2-460b-99f7-8ae10815c016-kube-api-access-tzbgs\") pod \"dfb02f25-f0c2-460b-99f7-8ae10815c016\" (UID: \"dfb02f25-f0c2-460b-99f7-8ae10815c016\") " Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.304940 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dfb02f25-f0c2-460b-99f7-8ae10815c016-logs" (OuterVolumeSpecName: "logs") pod "dfb02f25-f0c2-460b-99f7-8ae10815c016" (UID: "dfb02f25-f0c2-460b-99f7-8ae10815c016"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.312189 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb02f25-f0c2-460b-99f7-8ae10815c016-kube-api-access-tzbgs" (OuterVolumeSpecName: "kube-api-access-tzbgs") pod "dfb02f25-f0c2-460b-99f7-8ae10815c016" (UID: "dfb02f25-f0c2-460b-99f7-8ae10815c016"). InnerVolumeSpecName "kube-api-access-tzbgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.317506 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dfb02f25-f0c2-460b-99f7-8ae10815c016" (UID: "dfb02f25-f0c2-460b-99f7-8ae10815c016"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.360664 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfb02f25-f0c2-460b-99f7-8ae10815c016" (UID: "dfb02f25-f0c2-460b-99f7-8ae10815c016"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.367465 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.389958 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data" (OuterVolumeSpecName: "config-data") pod "dfb02f25-f0c2-460b-99f7-8ae10815c016" (UID: "dfb02f25-f0c2-460b-99f7-8ae10815c016"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.406324 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.406355 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzbgs\" (UniqueName: \"kubernetes.io/projected/dfb02f25-f0c2-460b-99f7-8ae10815c016-kube-api-access-tzbgs\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.406367 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.406376 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfb02f25-f0c2-460b-99f7-8ae10815c016-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.406384 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb02f25-f0c2-460b-99f7-8ae10815c016-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.419165 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.472613 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rpjpr"] Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.473063 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" podUID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" containerName="dnsmasq-dns" containerID="cri-o://03be38dd0db64bfa4381e561ed96be54602652d0daaabe06c972e7098a6d9505" gracePeriod=10 Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.543030 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.823401 4766 generic.go:334] "Generic (PLEG): container finished" podID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" containerID="03be38dd0db64bfa4381e561ed96be54602652d0daaabe06c972e7098a6d9505" exitCode=0 Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.823431 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" event={"ID":"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987","Type":"ContainerDied","Data":"03be38dd0db64bfa4381e561ed96be54602652d0daaabe06c972e7098a6d9505"} Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.825521 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerName="cinder-scheduler" containerID="cri-o://5890cc79cfa4669e7500bf99a1a8eae4ff0f229f2f447b5f5e7132a76de6180b" gracePeriod=30 Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.825942 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-584546f8-9kq4r" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.828788 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerName="probe" containerID="cri-o://21382bc634c266a6b0511875b9d323af3e34f39ab8c2a97bce3d7eea290f9e9d" gracePeriod=30 Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.828829 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-584546f8-9kq4r" event={"ID":"dfb02f25-f0c2-460b-99f7-8ae10815c016","Type":"ContainerDied","Data":"a21a6f6aa0e21dc3933b74e0a8a3a6390e83401c1477c59d9688aba482813561"} Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.828875 4766 scope.go:117] "RemoveContainer" containerID="a370253f7967af61b1e3e1e4e10df5f0a1204b69e552201b60fd8483dd92832a" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.863562 4766 scope.go:117] "RemoveContainer" containerID="2943e23fe2169c9438985840eb74c4ce41acda21f15bdd14f69e1c6b640843d8" Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.870547 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-584546f8-9kq4r"] Dec 09 03:35:28 crc kubenswrapper[4766]: I1209 03:35:28.886187 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-584546f8-9kq4r"] Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.529956 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.633098 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-sb\") pod \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.633347 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-svc\") pod \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.633546 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-swift-storage-0\") pod \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.633600 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-nb\") pod \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.633620 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmpwn\" (UniqueName: \"kubernetes.io/projected/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-kube-api-access-qmpwn\") pod \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.633638 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-config\") pod \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\" (UID: \"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987\") " Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.656047 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-kube-api-access-qmpwn" (OuterVolumeSpecName: "kube-api-access-qmpwn") pod "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" (UID: "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987"). InnerVolumeSpecName "kube-api-access-qmpwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.693758 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" (UID: "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.694433 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" (UID: "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.697772 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" (UID: "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.720093 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-config" (OuterVolumeSpecName: "config") pod "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" (UID: "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.725275 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" (UID: "4e56f4d2-6f0c-4ae7-a166-ae1249d6f987"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.738108 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.738145 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.738160 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.738177 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:29 
crc kubenswrapper[4766]: I1209 03:35:29.738188 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmpwn\" (UniqueName: \"kubernetes.io/projected/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-kube-api-access-qmpwn\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.738200 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.836271 4766 generic.go:334] "Generic (PLEG): container finished" podID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerID="21382bc634c266a6b0511875b9d323af3e34f39ab8c2a97bce3d7eea290f9e9d" exitCode=0 Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.836371 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"976f83ea-558d-45e2-afec-42a12ae6c8ec","Type":"ContainerDied","Data":"21382bc634c266a6b0511875b9d323af3e34f39ab8c2a97bce3d7eea290f9e9d"} Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.838109 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.838117 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-rpjpr" event={"ID":"4e56f4d2-6f0c-4ae7-a166-ae1249d6f987","Type":"ContainerDied","Data":"a8b97bac148e3be5931844a801ba200413f2d694a9e09ea69ac0642b4e345fd4"} Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.838146 4766 scope.go:117] "RemoveContainer" containerID="03be38dd0db64bfa4381e561ed96be54602652d0daaabe06c972e7098a6d9505" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.865690 4766 scope.go:117] "RemoveContainer" containerID="047671d421c3e3686a6bfde497ecc73c35b163e06508acad2b846fb0c5c88ec4" Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.873897 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rpjpr"] Dec 09 03:35:29 crc kubenswrapper[4766]: I1209 03:35:29.882018 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-rpjpr"] Dec 09 03:35:30 crc kubenswrapper[4766]: I1209 03:35:30.373669 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:35:30 crc kubenswrapper[4766]: I1209 03:35:30.855179 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" path="/var/lib/kubelet/pods/4e56f4d2-6f0c-4ae7-a166-ae1249d6f987/volumes" Dec 09 03:35:30 crc kubenswrapper[4766]: I1209 03:35:30.856362 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" path="/var/lib/kubelet/pods/dfb02f25-f0c2-460b-99f7-8ae10815c016/volumes" Dec 09 03:35:30 crc kubenswrapper[4766]: I1209 03:35:30.861466 4766 generic.go:334] "Generic (PLEG): container finished" podID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerID="79606849fff098e5f5d81ec4cb333c19fbcf0832408dee0e4a2616cb5d7c5c81" 
exitCode=0 Dec 09 03:35:30 crc kubenswrapper[4766]: I1209 03:35:30.861508 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c7f64cdcb-c5qmq" event={"ID":"852701bf-d4f9-4b3f-a2a0-76253feafe4f","Type":"ContainerDied","Data":"79606849fff098e5f5d81ec4cb333c19fbcf0832408dee0e4a2616cb5d7c5c81"} Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.375763 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.468681 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-httpd-config\") pod \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.468771 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-combined-ca-bundle\") pod \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.468873 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-config\") pod \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.468893 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-ovndb-tls-certs\") pod \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.468917 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xvhq4\" (UniqueName: \"kubernetes.io/projected/852701bf-d4f9-4b3f-a2a0-76253feafe4f-kube-api-access-xvhq4\") pod \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\" (UID: \"852701bf-d4f9-4b3f-a2a0-76253feafe4f\") " Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.476340 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "852701bf-d4f9-4b3f-a2a0-76253feafe4f" (UID: "852701bf-d4f9-4b3f-a2a0-76253feafe4f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.479506 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852701bf-d4f9-4b3f-a2a0-76253feafe4f-kube-api-access-xvhq4" (OuterVolumeSpecName: "kube-api-access-xvhq4") pod "852701bf-d4f9-4b3f-a2a0-76253feafe4f" (UID: "852701bf-d4f9-4b3f-a2a0-76253feafe4f"). InnerVolumeSpecName "kube-api-access-xvhq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.538985 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-config" (OuterVolumeSpecName: "config") pod "852701bf-d4f9-4b3f-a2a0-76253feafe4f" (UID: "852701bf-d4f9-4b3f-a2a0-76253feafe4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.541738 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "852701bf-d4f9-4b3f-a2a0-76253feafe4f" (UID: "852701bf-d4f9-4b3f-a2a0-76253feafe4f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.551422 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "852701bf-d4f9-4b3f-a2a0-76253feafe4f" (UID: "852701bf-d4f9-4b3f-a2a0-76253feafe4f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.570144 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.570177 4766 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.570189 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvhq4\" (UniqueName: \"kubernetes.io/projected/852701bf-d4f9-4b3f-a2a0-76253feafe4f-kube-api-access-xvhq4\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.570200 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.570214 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852701bf-d4f9-4b3f-a2a0-76253feafe4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.715615 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 09 03:35:31 crc 
kubenswrapper[4766]: E1209 03:35:31.716047 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerName="neutron-httpd" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716067 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerName="neutron-httpd" Dec 09 03:35:31 crc kubenswrapper[4766]: E1209 03:35:31.716081 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716089 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api" Dec 09 03:35:31 crc kubenswrapper[4766]: E1209 03:35:31.716103 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerName="neutron-api" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716113 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerName="neutron-api" Dec 09 03:35:31 crc kubenswrapper[4766]: E1209 03:35:31.716126 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api-log" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716134 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api-log" Dec 09 03:35:31 crc kubenswrapper[4766]: E1209 03:35:31.716152 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" containerName="dnsmasq-dns" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716160 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" containerName="dnsmasq-dns" Dec 09 03:35:31 crc kubenswrapper[4766]: E1209 
03:35:31.716183 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" containerName="init" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716191 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" containerName="init" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716405 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerName="neutron-api" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716427 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" containerName="neutron-httpd" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716443 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716458 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e56f4d2-6f0c-4ae7-a166-ae1249d6f987" containerName="dnsmasq-dns" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.716472 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb02f25-f0c2-460b-99f7-8ae10815c016" containerName="barbican-api-log" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.717099 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.721516 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bxn4g" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.722114 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.722302 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.745092 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.874719 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.874777 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config-secret\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.874890 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/076eae63-1ff4-4e1d-a1db-21a955f77b5f-kube-api-access-659lx\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.874952 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.878758 4766 generic.go:334] "Generic (PLEG): container finished" podID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerID="5890cc79cfa4669e7500bf99a1a8eae4ff0f229f2f447b5f5e7132a76de6180b" exitCode=0 Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.878840 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"976f83ea-558d-45e2-afec-42a12ae6c8ec","Type":"ContainerDied","Data":"5890cc79cfa4669e7500bf99a1a8eae4ff0f229f2f447b5f5e7132a76de6180b"} Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.925618 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c7f64cdcb-c5qmq" event={"ID":"852701bf-d4f9-4b3f-a2a0-76253feafe4f","Type":"ContainerDied","Data":"c81a5e6a8b8029d731f2e63d2877e5ce9ac1f3e17939fde1fdae328106493274"} Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.925693 4766 scope.go:117] "RemoveContainer" containerID="e166ebc5a4242345096ced25e89f65530d35ce71818720197b6d41e577df2045" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.926301 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c7f64cdcb-c5qmq" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.961431 4766 scope.go:117] "RemoveContainer" containerID="79606849fff098e5f5d81ec4cb333c19fbcf0832408dee0e4a2616cb5d7c5c81" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.966359 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c7f64cdcb-c5qmq"] Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.975906 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c7f64cdcb-c5qmq"] Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.977210 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/076eae63-1ff4-4e1d-a1db-21a955f77b5f-kube-api-access-659lx\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.977311 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.977447 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.977482 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.979071 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.991553 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config-secret\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.992124 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:31 crc kubenswrapper[4766]: I1209 03:35:31.997299 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/076eae63-1ff4-4e1d-a1db-21a955f77b5f-kube-api-access-659lx\") pod \"openstackclient\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.026422 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.027410 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.041417 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.088335 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.091156 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.099249 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 03:35:32 crc kubenswrapper[4766]: E1209 03:35:32.174573 4766 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 09 03:35:32 crc kubenswrapper[4766]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_076eae63-1ff4-4e1d-a1db-21a955f77b5f_0(a3b8cee55b8cab663a30dcc7aada0de5422d33aa62b6fbfc311cc5529228cbca): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a3b8cee55b8cab663a30dcc7aada0de5422d33aa62b6fbfc311cc5529228cbca" Netns:"/var/run/netns/2afd2c08-3008-47e2-9be2-5fc3bd307aa9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=a3b8cee55b8cab663a30dcc7aada0de5422d33aa62b6fbfc311cc5529228cbca;K8S_POD_UID=076eae63-1ff4-4e1d-a1db-21a955f77b5f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/076eae63-1ff4-4e1d-a1db-21a955f77b5f]: expected pod UID "076eae63-1ff4-4e1d-a1db-21a955f77b5f" but got "b121d489-4c67-4106-aa17-ec66f896ba25" from Kube API Dec 09 03:35:32 crc kubenswrapper[4766]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 09 03:35:32 crc kubenswrapper[4766]: > Dec 09 03:35:32 crc kubenswrapper[4766]: E1209 03:35:32.174628 4766 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 09 03:35:32 crc kubenswrapper[4766]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_076eae63-1ff4-4e1d-a1db-21a955f77b5f_0(a3b8cee55b8cab663a30dcc7aada0de5422d33aa62b6fbfc311cc5529228cbca): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a3b8cee55b8cab663a30dcc7aada0de5422d33aa62b6fbfc311cc5529228cbca" Netns:"/var/run/netns/2afd2c08-3008-47e2-9be2-5fc3bd307aa9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=a3b8cee55b8cab663a30dcc7aada0de5422d33aa62b6fbfc311cc5529228cbca;K8S_POD_UID=076eae63-1ff4-4e1d-a1db-21a955f77b5f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/076eae63-1ff4-4e1d-a1db-21a955f77b5f]: expected pod UID "076eae63-1ff4-4e1d-a1db-21a955f77b5f" but got "b121d489-4c67-4106-aa17-ec66f896ba25" from Kube API Dec 09 03:35:32 crc kubenswrapper[4766]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 09 03:35:32 crc kubenswrapper[4766]: > pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.180751 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.180798 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config-secret\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.180885 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.180939 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7t7\" (UniqueName: \"kubernetes.io/projected/b121d489-4c67-4106-aa17-ec66f896ba25-kube-api-access-dd7t7\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " 
pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.282412 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.282463 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config-secret\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.282538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.282583 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7t7\" (UniqueName: \"kubernetes.io/projected/b121d489-4c67-4106-aa17-ec66f896ba25-kube-api-access-dd7t7\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.284269 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.287108 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config-secret\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.287126 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.299535 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7t7\" (UniqueName: \"kubernetes.io/projected/b121d489-4c67-4106-aa17-ec66f896ba25-kube-api-access-dd7t7\") pod \"openstackclient\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.476064 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.720433 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.858372 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852701bf-d4f9-4b3f-a2a0-76253feafe4f" path="/var/lib/kubelet/pods/852701bf-d4f9-4b3f-a2a0-76253feafe4f/volumes" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.896455 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data\") pod \"976f83ea-558d-45e2-afec-42a12ae6c8ec\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.896535 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-combined-ca-bundle\") pod \"976f83ea-558d-45e2-afec-42a12ae6c8ec\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.896600 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/976f83ea-558d-45e2-afec-42a12ae6c8ec-etc-machine-id\") pod \"976f83ea-558d-45e2-afec-42a12ae6c8ec\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.896672 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkgl2\" (UniqueName: \"kubernetes.io/projected/976f83ea-558d-45e2-afec-42a12ae6c8ec-kube-api-access-hkgl2\") pod \"976f83ea-558d-45e2-afec-42a12ae6c8ec\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.896722 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data-custom\") 
pod \"976f83ea-558d-45e2-afec-42a12ae6c8ec\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.896805 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-scripts\") pod \"976f83ea-558d-45e2-afec-42a12ae6c8ec\" (UID: \"976f83ea-558d-45e2-afec-42a12ae6c8ec\") " Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.897640 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/976f83ea-558d-45e2-afec-42a12ae6c8ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "976f83ea-558d-45e2-afec-42a12ae6c8ec" (UID: "976f83ea-558d-45e2-afec-42a12ae6c8ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.901793 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976f83ea-558d-45e2-afec-42a12ae6c8ec-kube-api-access-hkgl2" (OuterVolumeSpecName: "kube-api-access-hkgl2") pod "976f83ea-558d-45e2-afec-42a12ae6c8ec" (UID: "976f83ea-558d-45e2-afec-42a12ae6c8ec"). InnerVolumeSpecName "kube-api-access-hkgl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.902163 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "976f83ea-558d-45e2-afec-42a12ae6c8ec" (UID: "976f83ea-558d-45e2-afec-42a12ae6c8ec"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.905404 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-scripts" (OuterVolumeSpecName: "scripts") pod "976f83ea-558d-45e2-afec-42a12ae6c8ec" (UID: "976f83ea-558d-45e2-afec-42a12ae6c8ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.950438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"976f83ea-558d-45e2-afec-42a12ae6c8ec","Type":"ContainerDied","Data":"f25abdee22f12253f223570ce2d19788f97b6a76cb798f3306ce97520e495909"} Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.950515 4766 scope.go:117] "RemoveContainer" containerID="21382bc634c266a6b0511875b9d323af3e34f39ab8c2a97bce3d7eea290f9e9d" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.950728 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.957792 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 03:35:32 crc kubenswrapper[4766]: I1209 03:35:32.978278 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "976f83ea-558d-45e2-afec-42a12ae6c8ec" (UID: "976f83ea-558d-45e2-afec-42a12ae6c8ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.000691 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.000724 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/976f83ea-558d-45e2-afec-42a12ae6c8ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.000735 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkgl2\" (UniqueName: \"kubernetes.io/projected/976f83ea-558d-45e2-afec-42a12ae6c8ec-kube-api-access-hkgl2\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.000747 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.000756 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.036439 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data" (OuterVolumeSpecName: "config-data") pod "976f83ea-558d-45e2-afec-42a12ae6c8ec" (UID: "976f83ea-558d-45e2-afec-42a12ae6c8ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:33 crc kubenswrapper[4766]: W1209 03:35:33.060027 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb121d489_4c67_4106_aa17_ec66f896ba25.slice/crio-d729325da7836f0c2f9200f9f7440f31679da79c443bdffd05c362d9644dbcd9 WatchSource:0}: Error finding container d729325da7836f0c2f9200f9f7440f31679da79c443bdffd05c362d9644dbcd9: Status 404 returned error can't find the container with id d729325da7836f0c2f9200f9f7440f31679da79c443bdffd05c362d9644dbcd9 Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.064599 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.064777 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.067328 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="076eae63-1ff4-4e1d-a1db-21a955f77b5f" podUID="b121d489-4c67-4106-aa17-ec66f896ba25" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.076000 4766 scope.go:117] "RemoveContainer" containerID="5890cc79cfa4669e7500bf99a1a8eae4ff0f229f2f447b5f5e7132a76de6180b" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.102646 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/976f83ea-558d-45e2-afec-42a12ae6c8ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.204273 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-combined-ca-bundle\") pod \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " Dec 09 
03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.204374 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/076eae63-1ff4-4e1d-a1db-21a955f77b5f-kube-api-access-659lx\") pod \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.204479 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config-secret\") pod \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.204631 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config\") pod \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\" (UID: \"076eae63-1ff4-4e1d-a1db-21a955f77b5f\") " Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.205528 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "076eae63-1ff4-4e1d-a1db-21a955f77b5f" (UID: "076eae63-1ff4-4e1d-a1db-21a955f77b5f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.208127 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076eae63-1ff4-4e1d-a1db-21a955f77b5f-kube-api-access-659lx" (OuterVolumeSpecName: "kube-api-access-659lx") pod "076eae63-1ff4-4e1d-a1db-21a955f77b5f" (UID: "076eae63-1ff4-4e1d-a1db-21a955f77b5f"). InnerVolumeSpecName "kube-api-access-659lx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.208132 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "076eae63-1ff4-4e1d-a1db-21a955f77b5f" (UID: "076eae63-1ff4-4e1d-a1db-21a955f77b5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.210130 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "076eae63-1ff4-4e1d-a1db-21a955f77b5f" (UID: "076eae63-1ff4-4e1d-a1db-21a955f77b5f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.283992 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.292514 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.306524 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.306555 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-659lx\" (UniqueName: \"kubernetes.io/projected/076eae63-1ff4-4e1d-a1db-21a955f77b5f-kube-api-access-659lx\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.306566 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.306576 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/076eae63-1ff4-4e1d-a1db-21a955f77b5f-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.314049 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:33 crc kubenswrapper[4766]: E1209 03:35:33.314617 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerName="cinder-scheduler" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.314641 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerName="cinder-scheduler" Dec 09 03:35:33 crc kubenswrapper[4766]: E1209 03:35:33.314669 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerName="probe" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.314679 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerName="probe" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.314899 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerName="cinder-scheduler" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.314933 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" containerName="probe" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.316111 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.318062 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.324472 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.509349 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3cf59e92-46b9-4693-b9ec-1a669b0e3897-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.509423 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.509473 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.510020 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 
03:35:33.510098 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-scripts\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.510255 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfpl4\" (UniqueName: \"kubernetes.io/projected/3cf59e92-46b9-4693-b9ec-1a669b0e3897-kube-api-access-qfpl4\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.612147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.612201 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.612276 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-scripts\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.612442 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfpl4\" (UniqueName: 
\"kubernetes.io/projected/3cf59e92-46b9-4693-b9ec-1a669b0e3897-kube-api-access-qfpl4\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.612470 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3cf59e92-46b9-4693-b9ec-1a669b0e3897-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.612513 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.612654 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3cf59e92-46b9-4693-b9ec-1a669b0e3897-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.618171 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.618395 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " 
pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.619734 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-scripts\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.620646 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.628687 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfpl4\" (UniqueName: \"kubernetes.io/projected/3cf59e92-46b9-4693-b9ec-1a669b0e3897-kube-api-access-qfpl4\") pod \"cinder-scheduler-0\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.642904 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.972420 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.972418 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b121d489-4c67-4106-aa17-ec66f896ba25","Type":"ContainerStarted","Data":"d729325da7836f0c2f9200f9f7440f31679da79c443bdffd05c362d9644dbcd9"} Dec 09 03:35:33 crc kubenswrapper[4766]: I1209 03:35:33.986841 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="076eae63-1ff4-4e1d-a1db-21a955f77b5f" podUID="b121d489-4c67-4106-aa17-ec66f896ba25" Dec 09 03:35:34 crc kubenswrapper[4766]: I1209 03:35:34.118556 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:35:34 crc kubenswrapper[4766]: W1209 03:35:34.126150 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cf59e92_46b9_4693_b9ec_1a669b0e3897.slice/crio-304ea017233a8d3ddf1b180def120dd60b1676c51d56e1a8e56b56306b7af19d WatchSource:0}: Error finding container 304ea017233a8d3ddf1b180def120dd60b1676c51d56e1a8e56b56306b7af19d: Status 404 returned error can't find the container with id 304ea017233a8d3ddf1b180def120dd60b1676c51d56e1a8e56b56306b7af19d Dec 09 03:35:34 crc kubenswrapper[4766]: I1209 03:35:34.857689 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076eae63-1ff4-4e1d-a1db-21a955f77b5f" path="/var/lib/kubelet/pods/076eae63-1ff4-4e1d-a1db-21a955f77b5f/volumes" Dec 09 03:35:34 crc kubenswrapper[4766]: I1209 03:35:34.858333 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="976f83ea-558d-45e2-afec-42a12ae6c8ec" path="/var/lib/kubelet/pods/976f83ea-558d-45e2-afec-42a12ae6c8ec/volumes" Dec 09 03:35:35 crc kubenswrapper[4766]: I1209 03:35:35.005101 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"3cf59e92-46b9-4693-b9ec-1a669b0e3897","Type":"ContainerStarted","Data":"d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493"} Dec 09 03:35:35 crc kubenswrapper[4766]: I1209 03:35:35.005143 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3cf59e92-46b9-4693-b9ec-1a669b0e3897","Type":"ContainerStarted","Data":"304ea017233a8d3ddf1b180def120dd60b1676c51d56e1a8e56b56306b7af19d"} Dec 09 03:35:35 crc kubenswrapper[4766]: I1209 03:35:35.310786 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.016427 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3cf59e92-46b9-4693-b9ec-1a669b0e3897","Type":"ContainerStarted","Data":"fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c"} Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.055779 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.055759697 podStartE2EDuration="3.055759697s" podCreationTimestamp="2025-12-09 03:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:36.055381326 +0000 UTC m=+1417.764686762" watchObservedRunningTime="2025-12-09 03:35:36.055759697 +0000 UTC m=+1417.765065123" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.135736 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7dd946b7cc-x6vjx"] Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.138986 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.141994 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.142709 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.142835 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.157740 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7dd946b7cc-x6vjx"] Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.284453 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-config-data\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.284492 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-etc-swift\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.284561 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-run-httpd\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.284579 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-internal-tls-certs\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.284606 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8jxk\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-kube-api-access-c8jxk\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.284625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-log-httpd\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.284779 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-public-tls-certs\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.284965 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-combined-ca-bundle\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 
03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387292 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-public-tls-certs\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387382 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-combined-ca-bundle\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387425 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-config-data\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387440 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-etc-swift\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387479 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-run-httpd\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387494 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-internal-tls-certs\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387514 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8jxk\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-kube-api-access-c8jxk\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387531 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-log-httpd\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.387878 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-log-httpd\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.388559 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-run-httpd\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.392811 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-public-tls-certs\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.393115 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-combined-ca-bundle\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.396701 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-internal-tls-certs\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.398142 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-etc-swift\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.401957 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-config-data\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.411940 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8jxk\" (UniqueName: 
\"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-kube-api-access-c8jxk\") pod \"swift-proxy-7dd946b7cc-x6vjx\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.457682 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:36 crc kubenswrapper[4766]: I1209 03:35:36.859681 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7dd946b7cc-x6vjx"] Dec 09 03:35:37 crc kubenswrapper[4766]: I1209 03:35:37.028419 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" event={"ID":"2ff493c4-bb15-4a40-9499-ca23bf79f42b","Type":"ContainerStarted","Data":"5a7a3cc2e9290abbc06f47ea749606fbcae32bc77197854c9ed797c2530224b6"} Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.038905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" event={"ID":"2ff493c4-bb15-4a40-9499-ca23bf79f42b","Type":"ContainerStarted","Data":"fed7164d33b3a7af7a40fbaa9797bc168284e01eb659de885498276e5f9eafb6"} Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.039260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" event={"ID":"2ff493c4-bb15-4a40-9499-ca23bf79f42b","Type":"ContainerStarted","Data":"f326e4c3ffff3569aafc2007ea39c3a003c9a3599ad5eed432501fa822d61ae4"} Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.039283 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.039307 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.068073 4766 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" podStartSLOduration=2.068052187 podStartE2EDuration="2.068052187s" podCreationTimestamp="2025-12-09 03:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:38.059381653 +0000 UTC m=+1419.768687079" watchObservedRunningTime="2025-12-09 03:35:38.068052187 +0000 UTC m=+1419.777357613" Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.478288 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.478658 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="proxy-httpd" containerID="cri-o://cfdd4a4572d158e42a537050a3198fb1b562beaa7b955d01d2bf967feb6d518e" gracePeriod=30 Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.478809 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="sg-core" containerID="cri-o://7dec08f04cb360b6f65de06cb26d35c89057423b01725a696a95038fbe46625c" gracePeriod=30 Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.478864 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="ceilometer-notification-agent" containerID="cri-o://eb8a622a7ab97514c618cf772bac6f12bb0184f2ca3e77b1b88175ba51327944" gracePeriod=30 Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.478950 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="ceilometer-central-agent" containerID="cri-o://ae00c50d3de713a5eb8356e5e5eff40b598fe44ecb96003e5577b4af3098f322" gracePeriod=30 Dec 09 
03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.496057 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": EOF" Dec 09 03:35:38 crc kubenswrapper[4766]: I1209 03:35:38.644045 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.050455 4766 generic.go:334] "Generic (PLEG): container finished" podID="1fb3049a-4c91-4695-bf48-303308399ccc" containerID="cfdd4a4572d158e42a537050a3198fb1b562beaa7b955d01d2bf967feb6d518e" exitCode=0 Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.050741 4766 generic.go:334] "Generic (PLEG): container finished" podID="1fb3049a-4c91-4695-bf48-303308399ccc" containerID="7dec08f04cb360b6f65de06cb26d35c89057423b01725a696a95038fbe46625c" exitCode=2 Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.050749 4766 generic.go:334] "Generic (PLEG): container finished" podID="1fb3049a-4c91-4695-bf48-303308399ccc" containerID="ae00c50d3de713a5eb8356e5e5eff40b598fe44ecb96003e5577b4af3098f322" exitCode=0 Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.050543 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerDied","Data":"cfdd4a4572d158e42a537050a3198fb1b562beaa7b955d01d2bf967feb6d518e"} Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.050858 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerDied","Data":"7dec08f04cb360b6f65de06cb26d35c89057423b01725a696a95038fbe46625c"} Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.050881 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerDied","Data":"ae00c50d3de713a5eb8356e5e5eff40b598fe44ecb96003e5577b4af3098f322"} Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.939840 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.940118 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerName="glance-log" containerID="cri-o://e5ccc7b26989027c52549aa00ff8006413817ad50cca148e1450a061f0fc1451" gracePeriod=30 Dec 09 03:35:39 crc kubenswrapper[4766]: I1209 03:35:39.940193 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerName="glance-httpd" containerID="cri-o://6afd4036f7abfb99f49a00c85b2b033db21b537124d7e5e086b8000010db07c4" gracePeriod=30 Dec 09 03:35:40 crc kubenswrapper[4766]: I1209 03:35:40.756901 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:35:40 crc kubenswrapper[4766]: I1209 03:35:40.757139 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerName="glance-log" containerID="cri-o://0a942c56c513f07e99a8589443f60f7079ef9d3894b9a9821bd7b1dfeda0999d" gracePeriod=30 Dec 09 03:35:40 crc kubenswrapper[4766]: I1209 03:35:40.757285 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerName="glance-httpd" containerID="cri-o://80aab41f9fbbfac91a743bb7b665b82a67505e1d541d7f949b32bd177465f98e" gracePeriod=30 Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.059206 4766 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mwwn7"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.060603 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.083072 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mwwn7"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.084078 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqqm\" (UniqueName: \"kubernetes.io/projected/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-kube-api-access-5wqqm\") pod \"nova-api-db-create-mwwn7\" (UID: \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\") " pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.084186 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-operator-scripts\") pod \"nova-api-db-create-mwwn7\" (UID: \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\") " pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.115989 4766 generic.go:334] "Generic (PLEG): container finished" podID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerID="e5ccc7b26989027c52549aa00ff8006413817ad50cca148e1450a061f0fc1451" exitCode=143 Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.116448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23eefead-e67a-4f5b-9a4b-f4506cd61c47","Type":"ContainerDied","Data":"e5ccc7b26989027c52549aa00ff8006413817ad50cca148e1450a061f0fc1451"} Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.135825 4766 generic.go:334] "Generic (PLEG): container finished" podID="30e34685-6db8-46e4-9e31-9da18f1b408e" 
containerID="0a942c56c513f07e99a8589443f60f7079ef9d3894b9a9821bd7b1dfeda0999d" exitCode=143 Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.135889 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30e34685-6db8-46e4-9e31-9da18f1b408e","Type":"ContainerDied","Data":"0a942c56c513f07e99a8589443f60f7079ef9d3894b9a9821bd7b1dfeda0999d"} Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.163942 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qx64j"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.165308 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.176965 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-baee-account-create-update-2lz8j"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.178367 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.185946 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-operator-scripts\") pod \"nova-cell0-db-create-qx64j\" (UID: \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\") " pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.185990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqqm\" (UniqueName: \"kubernetes.io/projected/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-kube-api-access-5wqqm\") pod \"nova-api-db-create-mwwn7\" (UID: \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\") " pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.186021 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbhb9\" (UniqueName: \"kubernetes.io/projected/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-kube-api-access-hbhb9\") pod \"nova-cell0-db-create-qx64j\" (UID: \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\") " pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.186056 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-operator-scripts\") pod \"nova-api-db-create-mwwn7\" (UID: \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\") " pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.186139 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de709648-266f-40ff-97e8-8427d71f31b6-operator-scripts\") pod 
\"nova-api-baee-account-create-update-2lz8j\" (UID: \"de709648-266f-40ff-97e8-8427d71f31b6\") " pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.186283 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r7tn\" (UniqueName: \"kubernetes.io/projected/de709648-266f-40ff-97e8-8427d71f31b6-kube-api-access-8r7tn\") pod \"nova-api-baee-account-create-update-2lz8j\" (UID: \"de709648-266f-40ff-97e8-8427d71f31b6\") " pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.186816 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-operator-scripts\") pod \"nova-api-db-create-mwwn7\" (UID: \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\") " pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.187285 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.190948 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qx64j"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.200628 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-baee-account-create-update-2lz8j"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.233041 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqqm\" (UniqueName: \"kubernetes.io/projected/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-kube-api-access-5wqqm\") pod \"nova-api-db-create-mwwn7\" (UID: \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\") " pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.287656 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de709648-266f-40ff-97e8-8427d71f31b6-operator-scripts\") pod \"nova-api-baee-account-create-update-2lz8j\" (UID: \"de709648-266f-40ff-97e8-8427d71f31b6\") " pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.287759 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r7tn\" (UniqueName: \"kubernetes.io/projected/de709648-266f-40ff-97e8-8427d71f31b6-kube-api-access-8r7tn\") pod \"nova-api-baee-account-create-update-2lz8j\" (UID: \"de709648-266f-40ff-97e8-8427d71f31b6\") " pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.288156 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-operator-scripts\") pod \"nova-cell0-db-create-qx64j\" (UID: \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\") " pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.288719 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de709648-266f-40ff-97e8-8427d71f31b6-operator-scripts\") pod \"nova-api-baee-account-create-update-2lz8j\" (UID: \"de709648-266f-40ff-97e8-8427d71f31b6\") " pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.288868 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-operator-scripts\") pod \"nova-cell0-db-create-qx64j\" (UID: \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\") " pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.288192 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hbhb9\" (UniqueName: \"kubernetes.io/projected/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-kube-api-access-hbhb9\") pod \"nova-cell0-db-create-qx64j\" (UID: \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\") " pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.305579 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbhb9\" (UniqueName: \"kubernetes.io/projected/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-kube-api-access-hbhb9\") pod \"nova-cell0-db-create-qx64j\" (UID: \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\") " pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.315619 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r7tn\" (UniqueName: \"kubernetes.io/projected/de709648-266f-40ff-97e8-8427d71f31b6-kube-api-access-8r7tn\") pod \"nova-api-baee-account-create-update-2lz8j\" (UID: \"de709648-266f-40ff-97e8-8427d71f31b6\") " pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.354797 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dkdgh"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.356482 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.364555 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-0dd1-account-create-update-wk7mx"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.366309 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.368095 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.388390 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0dd1-account-create-update-wk7mx"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.390184 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9t9\" (UniqueName: \"kubernetes.io/projected/6e3b76d0-577e-4c52-93b9-23325fc37634-kube-api-access-8v9t9\") pod \"nova-cell0-0dd1-account-create-update-wk7mx\" (UID: \"6e3b76d0-577e-4c52-93b9-23325fc37634\") " pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.390301 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zwd\" (UniqueName: \"kubernetes.io/projected/0ff86be1-0422-4655-ba5c-b063b8c42a61-kube-api-access-z9zwd\") pod \"nova-cell1-db-create-dkdgh\" (UID: \"0ff86be1-0422-4655-ba5c-b063b8c42a61\") " pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.390345 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e3b76d0-577e-4c52-93b9-23325fc37634-operator-scripts\") pod \"nova-cell0-0dd1-account-create-update-wk7mx\" (UID: \"6e3b76d0-577e-4c52-93b9-23325fc37634\") " pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.390424 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0ff86be1-0422-4655-ba5c-b063b8c42a61-operator-scripts\") pod \"nova-cell1-db-create-dkdgh\" (UID: \"0ff86be1-0422-4655-ba5c-b063b8c42a61\") " pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.399424 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.403736 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dkdgh"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.489660 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.492664 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ff86be1-0422-4655-ba5c-b063b8c42a61-operator-scripts\") pod \"nova-cell1-db-create-dkdgh\" (UID: \"0ff86be1-0422-4655-ba5c-b063b8c42a61\") " pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.492796 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9t9\" (UniqueName: \"kubernetes.io/projected/6e3b76d0-577e-4c52-93b9-23325fc37634-kube-api-access-8v9t9\") pod \"nova-cell0-0dd1-account-create-update-wk7mx\" (UID: \"6e3b76d0-577e-4c52-93b9-23325fc37634\") " pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.492847 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zwd\" (UniqueName: \"kubernetes.io/projected/0ff86be1-0422-4655-ba5c-b063b8c42a61-kube-api-access-z9zwd\") pod \"nova-cell1-db-create-dkdgh\" (UID: \"0ff86be1-0422-4655-ba5c-b063b8c42a61\") " pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:41 crc kubenswrapper[4766]: 
I1209 03:35:41.492888 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e3b76d0-577e-4c52-93b9-23325fc37634-operator-scripts\") pod \"nova-cell0-0dd1-account-create-update-wk7mx\" (UID: \"6e3b76d0-577e-4c52-93b9-23325fc37634\") " pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.493625 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ff86be1-0422-4655-ba5c-b063b8c42a61-operator-scripts\") pod \"nova-cell1-db-create-dkdgh\" (UID: \"0ff86be1-0422-4655-ba5c-b063b8c42a61\") " pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.493703 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e3b76d0-577e-4c52-93b9-23325fc37634-operator-scripts\") pod \"nova-cell0-0dd1-account-create-update-wk7mx\" (UID: \"6e3b76d0-577e-4c52-93b9-23325fc37634\") " pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.499148 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.508547 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zwd\" (UniqueName: \"kubernetes.io/projected/0ff86be1-0422-4655-ba5c-b063b8c42a61-kube-api-access-z9zwd\") pod \"nova-cell1-db-create-dkdgh\" (UID: \"0ff86be1-0422-4655-ba5c-b063b8c42a61\") " pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.528420 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9t9\" (UniqueName: \"kubernetes.io/projected/6e3b76d0-577e-4c52-93b9-23325fc37634-kube-api-access-8v9t9\") pod \"nova-cell0-0dd1-account-create-update-wk7mx\" (UID: \"6e3b76d0-577e-4c52-93b9-23325fc37634\") " pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.556574 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-64c0-account-create-update-5fb2k"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.557763 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.560147 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.565042 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-64c0-account-create-update-5fb2k"] Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.696764 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0c079d8-5ee5-42ba-ad34-165566762c81-operator-scripts\") pod \"nova-cell1-64c0-account-create-update-5fb2k\" (UID: \"c0c079d8-5ee5-42ba-ad34-165566762c81\") " pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.696830 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpdgl\" (UniqueName: \"kubernetes.io/projected/c0c079d8-5ee5-42ba-ad34-165566762c81-kube-api-access-cpdgl\") pod \"nova-cell1-64c0-account-create-update-5fb2k\" (UID: \"c0c079d8-5ee5-42ba-ad34-165566762c81\") " pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.720437 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.747263 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.798247 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0c079d8-5ee5-42ba-ad34-165566762c81-operator-scripts\") pod \"nova-cell1-64c0-account-create-update-5fb2k\" (UID: \"c0c079d8-5ee5-42ba-ad34-165566762c81\") " pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.798386 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpdgl\" (UniqueName: \"kubernetes.io/projected/c0c079d8-5ee5-42ba-ad34-165566762c81-kube-api-access-cpdgl\") pod \"nova-cell1-64c0-account-create-update-5fb2k\" (UID: \"c0c079d8-5ee5-42ba-ad34-165566762c81\") " pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.799325 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0c079d8-5ee5-42ba-ad34-165566762c81-operator-scripts\") pod \"nova-cell1-64c0-account-create-update-5fb2k\" (UID: \"c0c079d8-5ee5-42ba-ad34-165566762c81\") " pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.818355 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpdgl\" (UniqueName: \"kubernetes.io/projected/c0c079d8-5ee5-42ba-ad34-165566762c81-kube-api-access-cpdgl\") pod \"nova-cell1-64c0-account-create-update-5fb2k\" (UID: \"c0c079d8-5ee5-42ba-ad34-165566762c81\") " pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:41 crc kubenswrapper[4766]: I1209 03:35:41.903921 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:42 crc kubenswrapper[4766]: I1209 03:35:42.154609 4766 generic.go:334] "Generic (PLEG): container finished" podID="1fb3049a-4c91-4695-bf48-303308399ccc" containerID="eb8a622a7ab97514c618cf772bac6f12bb0184f2ca3e77b1b88175ba51327944" exitCode=0 Dec 09 03:35:42 crc kubenswrapper[4766]: I1209 03:35:42.154681 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerDied","Data":"eb8a622a7ab97514c618cf772bac6f12bb0184f2ca3e77b1b88175ba51327944"} Dec 09 03:35:42 crc kubenswrapper[4766]: I1209 03:35:42.925654 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": dial tcp 10.217.0.156:3000: connect: connection refused" Dec 09 03:35:43 crc kubenswrapper[4766]: I1209 03:35:43.868150 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 03:35:44 crc kubenswrapper[4766]: I1209 03:35:44.175022 4766 generic.go:334] "Generic (PLEG): container finished" podID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerID="6afd4036f7abfb99f49a00c85b2b033db21b537124d7e5e086b8000010db07c4" exitCode=0 Dec 09 03:35:44 crc kubenswrapper[4766]: I1209 03:35:44.175096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23eefead-e67a-4f5b-9a4b-f4506cd61c47","Type":"ContainerDied","Data":"6afd4036f7abfb99f49a00c85b2b033db21b537124d7e5e086b8000010db07c4"} Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.220422 4766 generic.go:334] "Generic (PLEG): container finished" podID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerID="80aab41f9fbbfac91a743bb7b665b82a67505e1d541d7f949b32bd177465f98e" exitCode=0 Dec 09 03:35:45 crc 
kubenswrapper[4766]: I1209 03:35:45.220574 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30e34685-6db8-46e4-9e31-9da18f1b408e","Type":"ContainerDied","Data":"80aab41f9fbbfac91a743bb7b665b82a67505e1d541d7f949b32bd177465f98e"} Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.266341 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.465853 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-combined-ca-bundle\") pod \"1fb3049a-4c91-4695-bf48-303308399ccc\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.466210 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9bp5\" (UniqueName: \"kubernetes.io/projected/1fb3049a-4c91-4695-bf48-303308399ccc-kube-api-access-q9bp5\") pod \"1fb3049a-4c91-4695-bf48-303308399ccc\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.466294 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-config-data\") pod \"1fb3049a-4c91-4695-bf48-303308399ccc\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.466341 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-log-httpd\") pod \"1fb3049a-4c91-4695-bf48-303308399ccc\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.466368 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-scripts\") pod \"1fb3049a-4c91-4695-bf48-303308399ccc\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.466398 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-sg-core-conf-yaml\") pod \"1fb3049a-4c91-4695-bf48-303308399ccc\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.466516 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-run-httpd\") pod \"1fb3049a-4c91-4695-bf48-303308399ccc\" (UID: \"1fb3049a-4c91-4695-bf48-303308399ccc\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.467447 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1fb3049a-4c91-4695-bf48-303308399ccc" (UID: "1fb3049a-4c91-4695-bf48-303308399ccc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.468192 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1fb3049a-4c91-4695-bf48-303308399ccc" (UID: "1fb3049a-4c91-4695-bf48-303308399ccc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.473639 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb3049a-4c91-4695-bf48-303308399ccc-kube-api-access-q9bp5" (OuterVolumeSpecName: "kube-api-access-q9bp5") pod "1fb3049a-4c91-4695-bf48-303308399ccc" (UID: "1fb3049a-4c91-4695-bf48-303308399ccc"). InnerVolumeSpecName "kube-api-access-q9bp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.475085 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-scripts" (OuterVolumeSpecName: "scripts") pod "1fb3049a-4c91-4695-bf48-303308399ccc" (UID: "1fb3049a-4c91-4695-bf48-303308399ccc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.527977 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.545432 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1fb3049a-4c91-4695-bf48-303308399ccc" (UID: "1fb3049a-4c91-4695-bf48-303308399ccc"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.585099 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-public-tls-certs\") pod \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.585417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.585482 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-scripts\") pod \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.585553 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4qtm\" (UniqueName: \"kubernetes.io/projected/23eefead-e67a-4f5b-9a4b-f4506cd61c47-kube-api-access-d4qtm\") pod \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.585652 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-config-data\") pod \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.585722 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-combined-ca-bundle\") pod \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.585773 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-httpd-run\") pod \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.586027 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-logs\") pod \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\" (UID: \"23eefead-e67a-4f5b-9a4b-f4506cd61c47\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.588227 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.588257 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9bp5\" (UniqueName: \"kubernetes.io/projected/1fb3049a-4c91-4695-bf48-303308399ccc-kube-api-access-q9bp5\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.588270 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fb3049a-4c91-4695-bf48-303308399ccc-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.588282 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.588296 4766 reconciler_common.go:293] 
"Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.588895 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-logs" (OuterVolumeSpecName: "logs") pod "23eefead-e67a-4f5b-9a4b-f4506cd61c47" (UID: "23eefead-e67a-4f5b-9a4b-f4506cd61c47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.593791 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "23eefead-e67a-4f5b-9a4b-f4506cd61c47" (UID: "23eefead-e67a-4f5b-9a4b-f4506cd61c47"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.600433 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23eefead-e67a-4f5b-9a4b-f4506cd61c47-kube-api-access-d4qtm" (OuterVolumeSpecName: "kube-api-access-d4qtm") pod "23eefead-e67a-4f5b-9a4b-f4506cd61c47" (UID: "23eefead-e67a-4f5b-9a4b-f4506cd61c47"). InnerVolumeSpecName "kube-api-access-d4qtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.600857 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-scripts" (OuterVolumeSpecName: "scripts") pod "23eefead-e67a-4f5b-9a4b-f4506cd61c47" (UID: "23eefead-e67a-4f5b-9a4b-f4506cd61c47"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.607448 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "23eefead-e67a-4f5b-9a4b-f4506cd61c47" (UID: "23eefead-e67a-4f5b-9a4b-f4506cd61c47"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.655977 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.663291 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fb3049a-4c91-4695-bf48-303308399ccc" (UID: "1fb3049a-4c91-4695-bf48-303308399ccc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.687776 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23eefead-e67a-4f5b-9a4b-f4506cd61c47" (UID: "23eefead-e67a-4f5b-9a4b-f4506cd61c47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.698472 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.698538 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.699033 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.699085 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4qtm\" (UniqueName: \"kubernetes.io/projected/23eefead-e67a-4f5b-9a4b-f4506cd61c47-kube-api-access-d4qtm\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.699145 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.699193 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.699260 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23eefead-e67a-4f5b-9a4b-f4506cd61c47-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.700657 4766 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "23eefead-e67a-4f5b-9a4b-f4506cd61c47" (UID: "23eefead-e67a-4f5b-9a4b-f4506cd61c47"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.701394 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-config-data" (OuterVolumeSpecName: "config-data") pod "1fb3049a-4c91-4695-bf48-303308399ccc" (UID: "1fb3049a-4c91-4695-bf48-303308399ccc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.707431 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-config-data" (OuterVolumeSpecName: "config-data") pod "23eefead-e67a-4f5b-9a4b-f4506cd61c47" (UID: "23eefead-e67a-4f5b-9a4b-f4506cd61c47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.723042 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.757545 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dkdgh"] Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.800439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-internal-tls-certs\") pod \"30e34685-6db8-46e4-9e31-9da18f1b408e\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.800873 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-logs\") pod \"30e34685-6db8-46e4-9e31-9da18f1b408e\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.800902 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-config-data\") pod \"30e34685-6db8-46e4-9e31-9da18f1b408e\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.800945 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz4wc\" (UniqueName: \"kubernetes.io/projected/30e34685-6db8-46e4-9e31-9da18f1b408e-kube-api-access-tz4wc\") pod \"30e34685-6db8-46e4-9e31-9da18f1b408e\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.800983 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"30e34685-6db8-46e4-9e31-9da18f1b408e\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801011 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-scripts\") pod \"30e34685-6db8-46e4-9e31-9da18f1b408e\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801157 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-httpd-run\") pod \"30e34685-6db8-46e4-9e31-9da18f1b408e\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801185 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-combined-ca-bundle\") pod \"30e34685-6db8-46e4-9e31-9da18f1b408e\" (UID: \"30e34685-6db8-46e4-9e31-9da18f1b408e\") " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801438 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-logs" (OuterVolumeSpecName: "logs") pod "30e34685-6db8-46e4-9e31-9da18f1b408e" (UID: "30e34685-6db8-46e4-9e31-9da18f1b408e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801695 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801712 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fb3049a-4c91-4695-bf48-303308399ccc-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801723 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801734 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.801744 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23eefead-e67a-4f5b-9a4b-f4506cd61c47-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.802561 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "30e34685-6db8-46e4-9e31-9da18f1b408e" (UID: "30e34685-6db8-46e4-9e31-9da18f1b408e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.814614 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-scripts" (OuterVolumeSpecName: "scripts") pod "30e34685-6db8-46e4-9e31-9da18f1b408e" (UID: "30e34685-6db8-46e4-9e31-9da18f1b408e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.814639 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e34685-6db8-46e4-9e31-9da18f1b408e-kube-api-access-tz4wc" (OuterVolumeSpecName: "kube-api-access-tz4wc") pod "30e34685-6db8-46e4-9e31-9da18f1b408e" (UID: "30e34685-6db8-46e4-9e31-9da18f1b408e"). InnerVolumeSpecName "kube-api-access-tz4wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.831651 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "30e34685-6db8-46e4-9e31-9da18f1b408e" (UID: "30e34685-6db8-46e4-9e31-9da18f1b408e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.860755 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30e34685-6db8-46e4-9e31-9da18f1b408e" (UID: "30e34685-6db8-46e4-9e31-9da18f1b408e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.879894 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-config-data" (OuterVolumeSpecName: "config-data") pod "30e34685-6db8-46e4-9e31-9da18f1b408e" (UID: "30e34685-6db8-46e4-9e31-9da18f1b408e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.893440 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30e34685-6db8-46e4-9e31-9da18f1b408e" (UID: "30e34685-6db8-46e4-9e31-9da18f1b408e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.905110 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.905143 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.905154 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e34685-6db8-46e4-9e31-9da18f1b408e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.905164 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc 
kubenswrapper[4766]: I1209 03:35:45.905176 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.905185 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e34685-6db8-46e4-9e31-9da18f1b408e-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.905193 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz4wc\" (UniqueName: \"kubernetes.io/projected/30e34685-6db8-46e4-9e31-9da18f1b408e-kube-api-access-tz4wc\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:45 crc kubenswrapper[4766]: I1209 03:35:45.930624 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.006317 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.174370 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-baee-account-create-update-2lz8j"] Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.182255 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qx64j"] Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.197386 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-64c0-account-create-update-5fb2k"] Dec 09 03:35:46 crc kubenswrapper[4766]: W1209 03:35:46.200875 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d03cec_423f_4fd0_b08e_17dfaa4cc8d6.slice/crio-bd28c21bb4ac8c5ced4feb94c016dd5e738461a288774b7e5fd928c10f65feb7 WatchSource:0}: Error finding container bd28c21bb4ac8c5ced4feb94c016dd5e738461a288774b7e5fd928c10f65feb7: Status 404 returned error can't find the container with id bd28c21bb4ac8c5ced4feb94c016dd5e738461a288774b7e5fd928c10f65feb7 Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.206414 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0dd1-account-create-update-wk7mx"] Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.215723 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mwwn7"] Dec 09 03:35:46 crc kubenswrapper[4766]: W1209 03:35:46.225437 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c9ae005_e7a2_4af5_9ff6_48f3364c4496.slice/crio-88fe32b78c3a91cf51ee04d843ddb3732c80f2822bead18ba25821ae3f7e401a WatchSource:0}: Error finding container 88fe32b78c3a91cf51ee04d843ddb3732c80f2822bead18ba25821ae3f7e401a: Status 404 returned error can't find the container with id 88fe32b78c3a91cf51ee04d843ddb3732c80f2822bead18ba25821ae3f7e401a Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.284448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-baee-account-create-update-2lz8j" event={"ID":"de709648-266f-40ff-97e8-8427d71f31b6","Type":"ContainerStarted","Data":"d9386d4f4e18946138ba4e529198d7652c3a00468b408ccb44d8569a2477430b"} Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.289198 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b121d489-4c67-4106-aa17-ec66f896ba25","Type":"ContainerStarted","Data":"1454c674a2602e750c4beddf30cf919ff742bdcc3255241923700f7fa0a09ef8"} Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.296810 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fb3049a-4c91-4695-bf48-303308399ccc","Type":"ContainerDied","Data":"dbb19d82afc22e867c96efa180cf9e791f9c0deab84af51f05e1906492a9caae"} Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.296879 4766 scope.go:117] "RemoveContainer" containerID="cfdd4a4572d158e42a537050a3198fb1b562beaa7b955d01d2bf967feb6d518e" Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.296933 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.322043 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23eefead-e67a-4f5b-9a4b-f4506cd61c47","Type":"ContainerDied","Data":"6d9ab65940b4124adbd9232d30ffdbedd2b53cf54f3419a2201977483cc7cf20"} Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.322097 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.323205 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.403959941 podStartE2EDuration="14.323187227s" podCreationTimestamp="2025-12-09 03:35:32 +0000 UTC" firstStartedPulling="2025-12-09 03:35:33.076187104 +0000 UTC m=+1414.785492530" lastFinishedPulling="2025-12-09 03:35:44.99541439 +0000 UTC m=+1426.704719816" observedRunningTime="2025-12-09 03:35:46.304810642 +0000 UTC m=+1428.014116168" watchObservedRunningTime="2025-12-09 03:35:46.323187227 +0000 UTC m=+1428.032492653"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.325149 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" event={"ID":"c0c079d8-5ee5-42ba-ad34-165566762c81","Type":"ContainerStarted","Data":"53588b4e7c0c2112837390eca33b49c7c5293522315321fc30df59e4f752d166"}
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.333129 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dkdgh" event={"ID":"0ff86be1-0422-4655-ba5c-b063b8c42a61","Type":"ContainerStarted","Data":"835b157de5befc6459566491e52b9afa85e9df4adea8de16cab9a07bdb847600"}
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.335764 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qx64j" event={"ID":"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6","Type":"ContainerStarted","Data":"bd28c21bb4ac8c5ced4feb94c016dd5e738461a288774b7e5fd928c10f65feb7"}
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.340681 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"30e34685-6db8-46e4-9e31-9da18f1b408e","Type":"ContainerDied","Data":"a0bd2a4bdf99d1e1c4b411b4b2e61f894e775b5f543f15b8c10014925015e337"}
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.340914 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.389098 4766 scope.go:117] "RemoveContainer" containerID="7dec08f04cb360b6f65de06cb26d35c89057423b01725a696a95038fbe46625c"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.394112 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-dkdgh" podStartSLOduration=5.394092025 podStartE2EDuration="5.394092025s" podCreationTimestamp="2025-12-09 03:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:46.352370812 +0000 UTC m=+1428.061676238" watchObservedRunningTime="2025-12-09 03:35:46.394092025 +0000 UTC m=+1428.103397451"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.439230 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.479279 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.499779 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7dd946b7cc-x6vjx"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.513815 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7dd946b7cc-x6vjx"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.518524 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: E1209 03:35:46.518877 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="proxy-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.518888 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="proxy-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: E1209 03:35:46.518897 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="ceilometer-central-agent"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.518903 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="ceilometer-central-agent"
Dec 09 03:35:46 crc kubenswrapper[4766]: E1209 03:35:46.518923 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerName="glance-log"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.518928 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerName="glance-log"
Dec 09 03:35:46 crc kubenswrapper[4766]: E1209 03:35:46.518938 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="ceilometer-notification-agent"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.518944 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="ceilometer-notification-agent"
Dec 09 03:35:46 crc kubenswrapper[4766]: E1209 03:35:46.518956 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerName="glance-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.519088 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerName="glance-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: E1209 03:35:46.519112 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerName="glance-log"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.519118 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerName="glance-log"
Dec 09 03:35:46 crc kubenswrapper[4766]: E1209 03:35:46.519125 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerName="glance-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.519130 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerName="glance-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: E1209 03:35:46.519142 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="sg-core"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.519148 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="sg-core"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.522455 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="sg-core"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.522484 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="ceilometer-central-agent"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.522497 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerName="glance-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.522505 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="ceilometer-notification-agent"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.522517 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" containerName="proxy-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.522528 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerName="glance-httpd"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.522538 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" containerName="glance-log"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.522547 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" containerName="glance-log"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.523463 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.527726 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.527953 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.528269 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hqfv8"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.528328 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.530551 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.543834 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.558695 4766 scope.go:117] "RemoveContainer" containerID="eb8a622a7ab97514c618cf772bac6f12bb0184f2ca3e77b1b88175ba51327944"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.567549 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.594558 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.611177 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.614560 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.620581 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.622671 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.624102 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.640548 4766 scope.go:117] "RemoveContainer" containerID="ae00c50d3de713a5eb8356e5e5eff40b598fe44ecb96003e5577b4af3098f322"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.643538 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.655062 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.656949 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.671045 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.672307 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.679921 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724166 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-scripts\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724244 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64h75\" (UniqueName: \"kubernetes.io/projected/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-kube-api-access-64h75\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724327 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724361 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724386 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724487 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-logs\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724506 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724533 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwc7v\" (UniqueName: \"kubernetes.io/projected/c6a00c8b-af47-4254-83de-a93a975b3afe-kube-api-access-hwc7v\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724568 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-config-data\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724586 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-log-httpd\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-config-data\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724671 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724714 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-scripts\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724772 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-run-httpd\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.724824 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.777310 4766 scope.go:117] "RemoveContainer" containerID="6afd4036f7abfb99f49a00c85b2b033db21b537124d7e5e086b8000010db07c4"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826431 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-logs\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826506 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-logs\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826535 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826561 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwc7v\" (UniqueName: \"kubernetes.io/projected/c6a00c8b-af47-4254-83de-a93a975b3afe-kube-api-access-hwc7v\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826582 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826608 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-config-data\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826631 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-log-httpd\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826660 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-config-data\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826688 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826725 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-scripts\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826779 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-run-httpd\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826804 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktgkw\" (UniqueName: \"kubernetes.io/projected/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-kube-api-access-ktgkw\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826844 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826883 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826934 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-scripts\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826971 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64h75\" (UniqueName: \"kubernetes.io/projected/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-kube-api-access-64h75\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.826985 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.827019 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.827049 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.827074 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.827107 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.827130 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.827151 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.827796 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-logs\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.828113 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-log-httpd\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.829862 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.830155 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-run-httpd\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.836396 4766 scope.go:117] "RemoveContainer" containerID="e5ccc7b26989027c52549aa00ff8006413817ad50cca148e1450a061f0fc1451"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.840433 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.843180 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-config-data\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.845101 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-scripts\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.846148 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.846299 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-config-data\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.848232 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.848986 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64h75\" (UniqueName: \"kubernetes.io/projected/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-kube-api-access-64h75\") pod \"ceilometer-0\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " pod="openstack/ceilometer-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.850065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-scripts\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.858199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.860568 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb3049a-4c91-4695-bf48-303308399ccc" path="/var/lib/kubelet/pods/1fb3049a-4c91-4695-bf48-303308399ccc/volumes"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.861628 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23eefead-e67a-4f5b-9a4b-f4506cd61c47" path="/var/lib/kubelet/pods/23eefead-e67a-4f5b-9a4b-f4506cd61c47/volumes"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.863499 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e34685-6db8-46e4-9e31-9da18f1b408e" path="/var/lib/kubelet/pods/30e34685-6db8-46e4-9e31-9da18f1b408e/volumes"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.871773 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwc7v\" (UniqueName: \"kubernetes.io/projected/c6a00c8b-af47-4254-83de-a93a975b3afe-kube-api-access-hwc7v\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.928759 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.928803 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktgkw\" (UniqueName: \"kubernetes.io/projected/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-kube-api-access-ktgkw\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.928849 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.928952 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.928969 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.928982 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.928998 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-logs\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.929039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.931099 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.936290 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.936546 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.936641 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.938754 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-logs\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.940122 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.950121 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0"
Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.953440 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktgkw\" (UniqueName: \"kubernetes.io/projected/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-kube-api-access-ktgkw\") pod \"glance-default-internal-api-0\" (UID:
\"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.975146 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " pod="openstack/glance-default-external-api-0" Dec 09 03:35:46 crc kubenswrapper[4766]: I1209 03:35:46.987500 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:46.999722 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " pod="openstack/glance-default-internal-api-0" Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.019718 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.044325 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.360376 4766 generic.go:334] "Generic (PLEG): container finished" podID="26d03cec-423f-4fd0-b08e-17dfaa4cc8d6" containerID="fba4f4072456bf794d0f60bf93f26a45e302fb3f1e1abcf3524f28c79c148914" exitCode=0 Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.360944 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qx64j" event={"ID":"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6","Type":"ContainerDied","Data":"fba4f4072456bf794d0f60bf93f26a45e302fb3f1e1abcf3524f28c79c148914"} Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.363683 4766 generic.go:334] "Generic (PLEG): container finished" podID="8c9ae005-e7a2-4af5-9ff6-48f3364c4496" containerID="6a9c046486c1c5dab899d5d0aba1033437d6e7306b2ba8194f3216464695d8cc" exitCode=0 Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.363735 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mwwn7" event={"ID":"8c9ae005-e7a2-4af5-9ff6-48f3364c4496","Type":"ContainerDied","Data":"6a9c046486c1c5dab899d5d0aba1033437d6e7306b2ba8194f3216464695d8cc"} Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.363755 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mwwn7" event={"ID":"8c9ae005-e7a2-4af5-9ff6-48f3364c4496","Type":"ContainerStarted","Data":"88fe32b78c3a91cf51ee04d843ddb3732c80f2822bead18ba25821ae3f7e401a"} Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.380345 4766 generic.go:334] "Generic (PLEG): container finished" podID="de709648-266f-40ff-97e8-8427d71f31b6" containerID="a76bb5c8ee791c900387340a2d3fc6fcb013d905d053c5740dcf188f5500cce9" exitCode=0 Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.380431 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-baee-account-create-update-2lz8j" 
event={"ID":"de709648-266f-40ff-97e8-8427d71f31b6","Type":"ContainerDied","Data":"a76bb5c8ee791c900387340a2d3fc6fcb013d905d053c5740dcf188f5500cce9"} Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.392533 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" event={"ID":"6e3b76d0-577e-4c52-93b9-23325fc37634","Type":"ContainerStarted","Data":"31d4aef01e5d6494a763f4846694b59d4c729b3f2ea71927992000f13a953678"} Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.392589 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" event={"ID":"6e3b76d0-577e-4c52-93b9-23325fc37634","Type":"ContainerStarted","Data":"1baf4020ac90a407cb36f400e181431dca90221386ce4f987ff6515a55dd9111"} Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.409828 4766 generic.go:334] "Generic (PLEG): container finished" podID="c0c079d8-5ee5-42ba-ad34-165566762c81" containerID="2e75fee47cbaa033a97f9c0a2fead9cf5720b793210e021d133929ac9e7853c5" exitCode=0 Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.409923 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" event={"ID":"c0c079d8-5ee5-42ba-ad34-165566762c81","Type":"ContainerDied","Data":"2e75fee47cbaa033a97f9c0a2fead9cf5720b793210e021d133929ac9e7853c5"} Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.411460 4766 generic.go:334] "Generic (PLEG): container finished" podID="0ff86be1-0422-4655-ba5c-b063b8c42a61" containerID="e25b6d52f84a5db4c92a5ad1a166d1f5cf47e474cd768f741c58bbd00f159311" exitCode=0 Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.412058 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dkdgh" event={"ID":"0ff86be1-0422-4655-ba5c-b063b8c42a61","Type":"ContainerDied","Data":"e25b6d52f84a5db4c92a5ad1a166d1f5cf47e474cd768f741c58bbd00f159311"} Dec 09 03:35:47 crc kubenswrapper[4766]: 
I1209 03:35:47.412904 4766 scope.go:117] "RemoveContainer" containerID="80aab41f9fbbfac91a743bb7b665b82a67505e1d541d7f949b32bd177465f98e" Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.424232 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" podStartSLOduration=6.42419541 podStartE2EDuration="6.42419541s" podCreationTimestamp="2025-12-09 03:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:47.415136025 +0000 UTC m=+1429.124441451" watchObservedRunningTime="2025-12-09 03:35:47.42419541 +0000 UTC m=+1429.133500836" Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.490037 4766 scope.go:117] "RemoveContainer" containerID="0a942c56c513f07e99a8589443f60f7079ef9d3894b9a9821bd7b1dfeda0999d" Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.635627 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.639650 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.706714 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:35:47 crc kubenswrapper[4766]: W1209 03:35:47.707094 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6a00c8b_af47_4254_83de_a93a975b3afe.slice/crio-55470ae3ef794b26c2d7e26506f8171544701644da67acf668e94742dc50fa32 WatchSource:0}: Error finding container 55470ae3ef794b26c2d7e26506f8171544701644da67acf668e94742dc50fa32: Status 404 returned error can't find the container with id 55470ae3ef794b26c2d7e26506f8171544701644da67acf668e94742dc50fa32 Dec 09 03:35:47 crc kubenswrapper[4766]: I1209 03:35:47.862085 4766 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:35:47 crc kubenswrapper[4766]: W1209 03:35:47.877775 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9edd6e7b_9841_43be_9478_5e7d06d8bd8d.slice/crio-fbaa25ec7000e8f35a0d95d5eec00b43998c95f95c727becb44ac2ba43f72698 WatchSource:0}: Error finding container fbaa25ec7000e8f35a0d95d5eec00b43998c95f95c727becb44ac2ba43f72698: Status 404 returned error can't find the container with id fbaa25ec7000e8f35a0d95d5eec00b43998c95f95c727becb44ac2ba43f72698 Dec 09 03:35:48 crc kubenswrapper[4766]: I1209 03:35:48.438764 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerStarted","Data":"7b540fafd23f952dd7eecce24389fc8acf0aacdef5e741da4962f3bebd0896c5"} Dec 09 03:35:48 crc kubenswrapper[4766]: I1209 03:35:48.439043 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerStarted","Data":"9f7f2cb05641f9b99ce63220bd89df4ca17800a07fe672c0158b81cf7d225244"} Dec 09 03:35:48 crc kubenswrapper[4766]: I1209 03:35:48.444618 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6a00c8b-af47-4254-83de-a93a975b3afe","Type":"ContainerStarted","Data":"4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271"} Dec 09 03:35:48 crc kubenswrapper[4766]: I1209 03:35:48.444644 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6a00c8b-af47-4254-83de-a93a975b3afe","Type":"ContainerStarted","Data":"55470ae3ef794b26c2d7e26506f8171544701644da67acf668e94742dc50fa32"} Dec 09 03:35:48 crc kubenswrapper[4766]: I1209 03:35:48.464528 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="6e3b76d0-577e-4c52-93b9-23325fc37634" containerID="31d4aef01e5d6494a763f4846694b59d4c729b3f2ea71927992000f13a953678" exitCode=0 Dec 09 03:35:48 crc kubenswrapper[4766]: I1209 03:35:48.464879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" event={"ID":"6e3b76d0-577e-4c52-93b9-23325fc37634","Type":"ContainerDied","Data":"31d4aef01e5d6494a763f4846694b59d4c729b3f2ea71927992000f13a953678"} Dec 09 03:35:48 crc kubenswrapper[4766]: I1209 03:35:48.465980 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9edd6e7b-9841-43be-9478-5e7d06d8bd8d","Type":"ContainerStarted","Data":"fbaa25ec7000e8f35a0d95d5eec00b43998c95f95c727becb44ac2ba43f72698"} Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.101747 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.229762 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de709648-266f-40ff-97e8-8427d71f31b6-operator-scripts\") pod \"de709648-266f-40ff-97e8-8427d71f31b6\" (UID: \"de709648-266f-40ff-97e8-8427d71f31b6\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.230051 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r7tn\" (UniqueName: \"kubernetes.io/projected/de709648-266f-40ff-97e8-8427d71f31b6-kube-api-access-8r7tn\") pod \"de709648-266f-40ff-97e8-8427d71f31b6\" (UID: \"de709648-266f-40ff-97e8-8427d71f31b6\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.231833 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de709648-266f-40ff-97e8-8427d71f31b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"de709648-266f-40ff-97e8-8427d71f31b6" (UID: "de709648-266f-40ff-97e8-8427d71f31b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.299014 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de709648-266f-40ff-97e8-8427d71f31b6-kube-api-access-8r7tn" (OuterVolumeSpecName: "kube-api-access-8r7tn") pod "de709648-266f-40ff-97e8-8427d71f31b6" (UID: "de709648-266f-40ff-97e8-8427d71f31b6"). InnerVolumeSpecName "kube-api-access-8r7tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.335485 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de709648-266f-40ff-97e8-8427d71f31b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.335528 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r7tn\" (UniqueName: \"kubernetes.io/projected/de709648-266f-40ff-97e8-8427d71f31b6-kube-api-access-8r7tn\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.413473 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.427004 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.467738 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.519289 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9edd6e7b-9841-43be-9478-5e7d06d8bd8d","Type":"ContainerStarted","Data":"60568c7893411368ef7934b6d7f5d2360db7af5753dc4500e120b6309331f0b4"} Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.523708 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.528956 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.528961 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-64c0-account-create-update-5fb2k" event={"ID":"c0c079d8-5ee5-42ba-ad34-165566762c81","Type":"ContainerDied","Data":"53588b4e7c0c2112837390eca33b49c7c5293522315321fc30df59e4f752d166"} Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.530721 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53588b4e7c0c2112837390eca33b49c7c5293522315321fc30df59e4f752d166" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.536291 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dkdgh" event={"ID":"0ff86be1-0422-4655-ba5c-b063b8c42a61","Type":"ContainerDied","Data":"835b157de5befc6459566491e52b9afa85e9df4adea8de16cab9a07bdb847600"} Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.536325 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="835b157de5befc6459566491e52b9afa85e9df4adea8de16cab9a07bdb847600" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.536328 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dkdgh" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.541150 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ff86be1-0422-4655-ba5c-b063b8c42a61-operator-scripts\") pod \"0ff86be1-0422-4655-ba5c-b063b8c42a61\" (UID: \"0ff86be1-0422-4655-ba5c-b063b8c42a61\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.541250 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zwd\" (UniqueName: \"kubernetes.io/projected/0ff86be1-0422-4655-ba5c-b063b8c42a61-kube-api-access-z9zwd\") pod \"0ff86be1-0422-4655-ba5c-b063b8c42a61\" (UID: \"0ff86be1-0422-4655-ba5c-b063b8c42a61\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.541285 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpdgl\" (UniqueName: \"kubernetes.io/projected/c0c079d8-5ee5-42ba-ad34-165566762c81-kube-api-access-cpdgl\") pod \"c0c079d8-5ee5-42ba-ad34-165566762c81\" (UID: \"c0c079d8-5ee5-42ba-ad34-165566762c81\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.541316 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0c079d8-5ee5-42ba-ad34-165566762c81-operator-scripts\") pod \"c0c079d8-5ee5-42ba-ad34-165566762c81\" (UID: \"c0c079d8-5ee5-42ba-ad34-165566762c81\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.544971 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0c079d8-5ee5-42ba-ad34-165566762c81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0c079d8-5ee5-42ba-ad34-165566762c81" (UID: "c0c079d8-5ee5-42ba-ad34-165566762c81"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.548360 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ff86be1-0422-4655-ba5c-b063b8c42a61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ff86be1-0422-4655-ba5c-b063b8c42a61" (UID: "0ff86be1-0422-4655-ba5c-b063b8c42a61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.575585 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff86be1-0422-4655-ba5c-b063b8c42a61-kube-api-access-z9zwd" (OuterVolumeSpecName: "kube-api-access-z9zwd") pod "0ff86be1-0422-4655-ba5c-b063b8c42a61" (UID: "0ff86be1-0422-4655-ba5c-b063b8c42a61"). InnerVolumeSpecName "kube-api-access-z9zwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.586487 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c079d8-5ee5-42ba-ad34-165566762c81-kube-api-access-cpdgl" (OuterVolumeSpecName: "kube-api-access-cpdgl") pod "c0c079d8-5ee5-42ba-ad34-165566762c81" (UID: "c0c079d8-5ee5-42ba-ad34-165566762c81"). InnerVolumeSpecName "kube-api-access-cpdgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.603249 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mwwn7" event={"ID":"8c9ae005-e7a2-4af5-9ff6-48f3364c4496","Type":"ContainerDied","Data":"88fe32b78c3a91cf51ee04d843ddb3732c80f2822bead18ba25821ae3f7e401a"} Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.603291 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88fe32b78c3a91cf51ee04d843ddb3732c80f2822bead18ba25821ae3f7e401a" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.603345 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mwwn7" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.614416 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-baee-account-create-update-2lz8j" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.614588 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-baee-account-create-update-2lz8j" event={"ID":"de709648-266f-40ff-97e8-8427d71f31b6","Type":"ContainerDied","Data":"d9386d4f4e18946138ba4e529198d7652c3a00468b408ccb44d8569a2477430b"} Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.614636 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9386d4f4e18946138ba4e529198d7652c3a00468b408ccb44d8569a2477430b" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.645128 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbhb9\" (UniqueName: \"kubernetes.io/projected/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-kube-api-access-hbhb9\") pod \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\" (UID: \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.645518 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqqm\" (UniqueName: \"kubernetes.io/projected/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-kube-api-access-5wqqm\") pod \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\" (UID: \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.645624 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-operator-scripts\") pod \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\" (UID: \"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.645699 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-operator-scripts\") pod \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\" (UID: \"8c9ae005-e7a2-4af5-9ff6-48f3364c4496\") " Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.646196 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ff86be1-0422-4655-ba5c-b063b8c42a61-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.646245 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zwd\" (UniqueName: \"kubernetes.io/projected/0ff86be1-0422-4655-ba5c-b063b8c42a61-kube-api-access-z9zwd\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.646260 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpdgl\" (UniqueName: \"kubernetes.io/projected/c0c079d8-5ee5-42ba-ad34-165566762c81-kube-api-access-cpdgl\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.646272 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c0c079d8-5ee5-42ba-ad34-165566762c81-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.648334 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c9ae005-e7a2-4af5-9ff6-48f3364c4496" (UID: "8c9ae005-e7a2-4af5-9ff6-48f3364c4496"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.649849 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26d03cec-423f-4fd0-b08e-17dfaa4cc8d6" (UID: "26d03cec-423f-4fd0-b08e-17dfaa4cc8d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.652869 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-kube-api-access-hbhb9" (OuterVolumeSpecName: "kube-api-access-hbhb9") pod "26d03cec-423f-4fd0-b08e-17dfaa4cc8d6" (UID: "26d03cec-423f-4fd0-b08e-17dfaa4cc8d6"). InnerVolumeSpecName "kube-api-access-hbhb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.655204 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-kube-api-access-5wqqm" (OuterVolumeSpecName: "kube-api-access-5wqqm") pod "8c9ae005-e7a2-4af5-9ff6-48f3364c4496" (UID: "8c9ae005-e7a2-4af5-9ff6-48f3364c4496"). InnerVolumeSpecName "kube-api-access-5wqqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.747461 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbhb9\" (UniqueName: \"kubernetes.io/projected/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-kube-api-access-hbhb9\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.747492 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqqm\" (UniqueName: \"kubernetes.io/projected/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-kube-api-access-5wqqm\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.747502 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:49 crc kubenswrapper[4766]: I1209 03:35:49.747515 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9ae005-e7a2-4af5-9ff6-48f3364c4496-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.046865 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.173355 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e3b76d0-577e-4c52-93b9-23325fc37634-operator-scripts\") pod \"6e3b76d0-577e-4c52-93b9-23325fc37634\" (UID: \"6e3b76d0-577e-4c52-93b9-23325fc37634\") " Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.173526 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v9t9\" (UniqueName: \"kubernetes.io/projected/6e3b76d0-577e-4c52-93b9-23325fc37634-kube-api-access-8v9t9\") pod \"6e3b76d0-577e-4c52-93b9-23325fc37634\" (UID: \"6e3b76d0-577e-4c52-93b9-23325fc37634\") " Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.174239 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e3b76d0-577e-4c52-93b9-23325fc37634-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e3b76d0-577e-4c52-93b9-23325fc37634" (UID: "6e3b76d0-577e-4c52-93b9-23325fc37634"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.185173 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3b76d0-577e-4c52-93b9-23325fc37634-kube-api-access-8v9t9" (OuterVolumeSpecName: "kube-api-access-8v9t9") pod "6e3b76d0-577e-4c52-93b9-23325fc37634" (UID: "6e3b76d0-577e-4c52-93b9-23325fc37634"). InnerVolumeSpecName "kube-api-access-8v9t9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.275925 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e3b76d0-577e-4c52-93b9-23325fc37634-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.276059 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v9t9\" (UniqueName: \"kubernetes.io/projected/6e3b76d0-577e-4c52-93b9-23325fc37634-kube-api-access-8v9t9\") on node \"crc\" DevicePath \"\"" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.630387 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerStarted","Data":"bce8f13b6da1df5a3143a68312e109839b40f3c0f958a8580a94df4609278eae"} Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.630711 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerStarted","Data":"3c89f9a2e9538f12fc2b2254f5df2a9194866a2908d3742a82a50168850ab6d2"} Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.633248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qx64j" event={"ID":"26d03cec-423f-4fd0-b08e-17dfaa4cc8d6","Type":"ContainerDied","Data":"bd28c21bb4ac8c5ced4feb94c016dd5e738461a288774b7e5fd928c10f65feb7"} Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.633280 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd28c21bb4ac8c5ced4feb94c016dd5e738461a288774b7e5fd928c10f65feb7" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.633388 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qx64j" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.635021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6a00c8b-af47-4254-83de-a93a975b3afe","Type":"ContainerStarted","Data":"d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40"} Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.637969 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" event={"ID":"6e3b76d0-577e-4c52-93b9-23325fc37634","Type":"ContainerDied","Data":"1baf4020ac90a407cb36f400e181431dca90221386ce4f987ff6515a55dd9111"} Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.638009 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1baf4020ac90a407cb36f400e181431dca90221386ce4f987ff6515a55dd9111" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.637978 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-0dd1-account-create-update-wk7mx" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.639754 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9edd6e7b-9841-43be-9478-5e7d06d8bd8d","Type":"ContainerStarted","Data":"f77816b92b5912c6b1c7dc678c8c50b01d826733809db88019e22a9ee0865718"} Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.662580 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.662563728 podStartE2EDuration="4.662563728s" podCreationTimestamp="2025-12-09 03:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:50.658903419 +0000 UTC m=+1432.368208845" watchObservedRunningTime="2025-12-09 03:35:50.662563728 +0000 UTC m=+1432.371869154" Dec 09 03:35:50 crc kubenswrapper[4766]: I1209 03:35:50.696699 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.696677726 podStartE2EDuration="4.696677726s" podCreationTimestamp="2025-12-09 03:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:35:50.692281878 +0000 UTC m=+1432.401587314" watchObservedRunningTime="2025-12-09 03:35:50.696677726 +0000 UTC m=+1432.405983162" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.606638 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5f62v"] Dec 09 03:35:51 crc kubenswrapper[4766]: E1209 03:35:51.607254 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3b76d0-577e-4c52-93b9-23325fc37634" containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607278 4766 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3b76d0-577e-4c52-93b9-23325fc37634" containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: E1209 03:35:51.607295 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff86be1-0422-4655-ba5c-b063b8c42a61" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607306 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff86be1-0422-4655-ba5c-b063b8c42a61" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: E1209 03:35:51.607317 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c9ae005-e7a2-4af5-9ff6-48f3364c4496" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607325 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9ae005-e7a2-4af5-9ff6-48f3364c4496" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: E1209 03:35:51.607346 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d03cec-423f-4fd0-b08e-17dfaa4cc8d6" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607354 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d03cec-423f-4fd0-b08e-17dfaa4cc8d6" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: E1209 03:35:51.607372 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de709648-266f-40ff-97e8-8427d71f31b6" containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607379 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de709648-266f-40ff-97e8-8427d71f31b6" containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: E1209 03:35:51.607395 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c079d8-5ee5-42ba-ad34-165566762c81" 
containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607402 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c079d8-5ee5-42ba-ad34-165566762c81" containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607620 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="de709648-266f-40ff-97e8-8427d71f31b6" containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607642 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d03cec-423f-4fd0-b08e-17dfaa4cc8d6" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607659 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c079d8-5ee5-42ba-ad34-165566762c81" containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607681 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c9ae005-e7a2-4af5-9ff6-48f3364c4496" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607693 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3b76d0-577e-4c52-93b9-23325fc37634" containerName="mariadb-account-create-update" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.607715 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff86be1-0422-4655-ba5c-b063b8c42a61" containerName="mariadb-database-create" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.608257 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.613546 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.613835 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.621257 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v6kg7" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.621980 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5f62v"] Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.676610 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerStarted","Data":"47a846277e34f92904c27c557f5c4afbf2717ccce47ec249edd14dac38e54e99"} Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.676648 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.700144 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lgd\" (UniqueName: \"kubernetes.io/projected/1c8159d8-7758-4a35-ba5c-51ce1edae988-kube-api-access-r9lgd\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.700479 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-config-data\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: 
\"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.700605 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.700753 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-scripts\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.700987 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.987559392 podStartE2EDuration="5.700967396s" podCreationTimestamp="2025-12-09 03:35:46 +0000 UTC" firstStartedPulling="2025-12-09 03:35:47.639449043 +0000 UTC m=+1429.348754459" lastFinishedPulling="2025-12-09 03:35:51.352857037 +0000 UTC m=+1433.062162463" observedRunningTime="2025-12-09 03:35:51.69741411 +0000 UTC m=+1433.406719556" watchObservedRunningTime="2025-12-09 03:35:51.700967396 +0000 UTC m=+1433.410272822" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.802396 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9lgd\" (UniqueName: \"kubernetes.io/projected/1c8159d8-7758-4a35-ba5c-51ce1edae988-kube-api-access-r9lgd\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 
03:35:51.802555 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-config-data\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.802612 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.802683 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-scripts\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.807866 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.809118 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-scripts\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.815813 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-config-data\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.821907 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9lgd\" (UniqueName: \"kubernetes.io/projected/1c8159d8-7758-4a35-ba5c-51ce1edae988-kube-api-access-r9lgd\") pod \"nova-cell0-conductor-db-sync-5f62v\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:51 crc kubenswrapper[4766]: I1209 03:35:51.927470 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:35:52 crc kubenswrapper[4766]: I1209 03:35:52.415483 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5f62v"] Dec 09 03:35:52 crc kubenswrapper[4766]: W1209 03:35:52.433568 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c8159d8_7758_4a35_ba5c_51ce1edae988.slice/crio-36ff8fe199791b7e06f5fa6030e8351f883ea05f94e5d8e965e56f43d8d8f419 WatchSource:0}: Error finding container 36ff8fe199791b7e06f5fa6030e8351f883ea05f94e5d8e965e56f43d8d8f419: Status 404 returned error can't find the container with id 36ff8fe199791b7e06f5fa6030e8351f883ea05f94e5d8e965e56f43d8d8f419 Dec 09 03:35:52 crc kubenswrapper[4766]: I1209 03:35:52.684566 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5f62v" event={"ID":"1c8159d8-7758-4a35-ba5c-51ce1edae988","Type":"ContainerStarted","Data":"36ff8fe199791b7e06f5fa6030e8351f883ea05f94e5d8e965e56f43d8d8f419"} Dec 09 03:35:56 crc kubenswrapper[4766]: I1209 
03:35:56.990433 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 03:35:56 crc kubenswrapper[4766]: I1209 03:35:56.990957 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.023878 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.037392 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.046552 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.046613 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.094484 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.109180 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.739373 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.739784 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: I1209 03:35:57.739804 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 03:35:57 crc kubenswrapper[4766]: 
I1209 03:35:57.740140 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 03:35:59 crc kubenswrapper[4766]: I1209 03:35:59.755181 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 03:35:59 crc kubenswrapper[4766]: I1209 03:35:59.755501 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 03:35:59 crc kubenswrapper[4766]: I1209 03:35:59.755186 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 03:35:59 crc kubenswrapper[4766]: I1209 03:35:59.755623 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 03:35:59 crc kubenswrapper[4766]: I1209 03:35:59.807674 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.222571 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.231367 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.231690 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.429080 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.429438 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="ceilometer-central-agent" containerID="cri-o://7b540fafd23f952dd7eecce24389fc8acf0aacdef5e741da4962f3bebd0896c5" gracePeriod=30 Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 
03:36:00.429981 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="proxy-httpd" containerID="cri-o://47a846277e34f92904c27c557f5c4afbf2717ccce47ec249edd14dac38e54e99" gracePeriod=30 Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.430065 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="sg-core" containerID="cri-o://bce8f13b6da1df5a3143a68312e109839b40f3c0f958a8580a94df4609278eae" gracePeriod=30 Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.430109 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="ceilometer-notification-agent" containerID="cri-o://3c89f9a2e9538f12fc2b2254f5df2a9194866a2908d3742a82a50168850ab6d2" gracePeriod=30 Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.767443 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerID="47a846277e34f92904c27c557f5c4afbf2717ccce47ec249edd14dac38e54e99" exitCode=0 Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.767736 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerID="bce8f13b6da1df5a3143a68312e109839b40f3c0f958a8580a94df4609278eae" exitCode=2 Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.768448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerDied","Data":"47a846277e34f92904c27c557f5c4afbf2717ccce47ec249edd14dac38e54e99"} Dec 09 03:36:00 crc kubenswrapper[4766]: I1209 03:36:00.768540 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerDied","Data":"bce8f13b6da1df5a3143a68312e109839b40f3c0f958a8580a94df4609278eae"} Dec 09 03:36:01 crc kubenswrapper[4766]: I1209 03:36:01.779682 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerID="7b540fafd23f952dd7eecce24389fc8acf0aacdef5e741da4962f3bebd0896c5" exitCode=0 Dec 09 03:36:01 crc kubenswrapper[4766]: I1209 03:36:01.779727 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerDied","Data":"7b540fafd23f952dd7eecce24389fc8acf0aacdef5e741da4962f3bebd0896c5"} Dec 09 03:36:02 crc kubenswrapper[4766]: I1209 03:36:02.804008 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerID="3c89f9a2e9538f12fc2b2254f5df2a9194866a2908d3742a82a50168850ab6d2" exitCode=0 Dec 09 03:36:02 crc kubenswrapper[4766]: I1209 03:36:02.804107 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerDied","Data":"3c89f9a2e9538f12fc2b2254f5df2a9194866a2908d3742a82a50168850ab6d2"} Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.465369 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.641360 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-config-data\") pod \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.641700 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-log-httpd\") pod \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.641820 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-run-httpd\") pod \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.642497 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-scripts\") pod \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.642617 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64h75\" (UniqueName: \"kubernetes.io/projected/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-kube-api-access-64h75\") pod \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.642878 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8e70c8f1-c33a-4cc6-b1f8-758e35d00903" (UID: "8e70c8f1-c33a-4cc6-b1f8-758e35d00903"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.643211 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8e70c8f1-c33a-4cc6-b1f8-758e35d00903" (UID: "8e70c8f1-c33a-4cc6-b1f8-758e35d00903"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.643636 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-combined-ca-bundle\") pod \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.643786 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-sg-core-conf-yaml\") pod \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.645958 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.645984 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 
03:36:03.663426 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-scripts" (OuterVolumeSpecName: "scripts") pod "8e70c8f1-c33a-4cc6-b1f8-758e35d00903" (UID: "8e70c8f1-c33a-4cc6-b1f8-758e35d00903"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.667448 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-kube-api-access-64h75" (OuterVolumeSpecName: "kube-api-access-64h75") pod "8e70c8f1-c33a-4cc6-b1f8-758e35d00903" (UID: "8e70c8f1-c33a-4cc6-b1f8-758e35d00903"). InnerVolumeSpecName "kube-api-access-64h75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.708310 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8e70c8f1-c33a-4cc6-b1f8-758e35d00903" (UID: "8e70c8f1-c33a-4cc6-b1f8-758e35d00903"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.746306 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e70c8f1-c33a-4cc6-b1f8-758e35d00903" (UID: "8e70c8f1-c33a-4cc6-b1f8-758e35d00903"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.746848 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-combined-ca-bundle\") pod \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\" (UID: \"8e70c8f1-c33a-4cc6-b1f8-758e35d00903\") " Dec 09 03:36:03 crc kubenswrapper[4766]: W1209 03:36:03.747024 4766 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8e70c8f1-c33a-4cc6-b1f8-758e35d00903/volumes/kubernetes.io~secret/combined-ca-bundle Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.747057 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e70c8f1-c33a-4cc6-b1f8-758e35d00903" (UID: "8e70c8f1-c33a-4cc6-b1f8-758e35d00903"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.747365 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.747381 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.747392 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.747402 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64h75\" (UniqueName: \"kubernetes.io/projected/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-kube-api-access-64h75\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.776456 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-config-data" (OuterVolumeSpecName: "config-data") pod "8e70c8f1-c33a-4cc6-b1f8-758e35d00903" (UID: "8e70c8f1-c33a-4cc6-b1f8-758e35d00903"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.815912 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5f62v" event={"ID":"1c8159d8-7758-4a35-ba5c-51ce1edae988","Type":"ContainerStarted","Data":"d9fab7dfeb3826cd0321cfd7d0feacdb6d86c47207cf4911ed9ec23c6780f0a1"} Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.820106 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8e70c8f1-c33a-4cc6-b1f8-758e35d00903","Type":"ContainerDied","Data":"9f7f2cb05641f9b99ce63220bd89df4ca17800a07fe672c0158b81cf7d225244"} Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.820153 4766 scope.go:117] "RemoveContainer" containerID="47a846277e34f92904c27c557f5c4afbf2717ccce47ec249edd14dac38e54e99" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.820337 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.837169 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5f62v" podStartSLOduration=1.880267177 podStartE2EDuration="12.837147365s" podCreationTimestamp="2025-12-09 03:35:51 +0000 UTC" firstStartedPulling="2025-12-09 03:35:52.435879686 +0000 UTC m=+1434.145185112" lastFinishedPulling="2025-12-09 03:36:03.392759864 +0000 UTC m=+1445.102065300" observedRunningTime="2025-12-09 03:36:03.831951685 +0000 UTC m=+1445.541257111" watchObservedRunningTime="2025-12-09 03:36:03.837147365 +0000 UTC m=+1445.546452801" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.850326 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e70c8f1-c33a-4cc6-b1f8-758e35d00903-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.858450 4766 scope.go:117] 
"RemoveContainer" containerID="bce8f13b6da1df5a3143a68312e109839b40f3c0f958a8580a94df4609278eae" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.876118 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.888386 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.905547 4766 scope.go:117] "RemoveContainer" containerID="3c89f9a2e9538f12fc2b2254f5df2a9194866a2908d3742a82a50168850ab6d2" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.937631 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:03 crc kubenswrapper[4766]: E1209 03:36:03.938110 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="ceilometer-notification-agent" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.938131 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="ceilometer-notification-agent" Dec 09 03:36:03 crc kubenswrapper[4766]: E1209 03:36:03.938161 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="sg-core" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.938168 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="sg-core" Dec 09 03:36:03 crc kubenswrapper[4766]: E1209 03:36:03.938184 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="ceilometer-central-agent" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.938190 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="ceilometer-central-agent" Dec 09 03:36:03 crc kubenswrapper[4766]: E1209 
03:36:03.938202 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="proxy-httpd" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.938207 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="proxy-httpd" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.938430 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="sg-core" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.938457 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="proxy-httpd" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.938469 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="ceilometer-notification-agent" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.938478 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" containerName="ceilometer-central-agent" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.940232 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.942364 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.946420 4766 scope.go:117] "RemoveContainer" containerID="7b540fafd23f952dd7eecce24389fc8acf0aacdef5e741da4962f3bebd0896c5" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.946793 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 03:36:03 crc kubenswrapper[4766]: I1209 03:36:03.964691 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.054463 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.054593 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-run-httpd\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.054697 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-log-httpd\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.054802 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-scripts\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.054873 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spl2n\" (UniqueName: \"kubernetes.io/projected/e4e1674a-ac59-47c4-b004-6a867f666328-kube-api-access-spl2n\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.055000 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.055099 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-config-data\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.119869 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:04 crc kubenswrapper[4766]: E1209 03:36:04.120516 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-spl2n log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="e4e1674a-ac59-47c4-b004-6a867f666328" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.156378 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-scripts\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.156434 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spl2n\" (UniqueName: \"kubernetes.io/projected/e4e1674a-ac59-47c4-b004-6a867f666328-kube-api-access-spl2n\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.156469 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.156509 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-config-data\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.156552 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.156582 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-run-httpd\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 
03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.156603 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-log-httpd\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.157028 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-log-httpd\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.157123 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-run-httpd\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.160065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-scripts\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.161929 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.162104 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.162453 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-config-data\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.174052 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spl2n\" (UniqueName: \"kubernetes.io/projected/e4e1674a-ac59-47c4-b004-6a867f666328-kube-api-access-spl2n\") pod \"ceilometer-0\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.831331 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.844330 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.850899 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e70c8f1-c33a-4cc6-b1f8-758e35d00903" path="/var/lib/kubelet/pods/8e70c8f1-c33a-4cc6-b1f8-758e35d00903/volumes" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.969321 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-sg-core-conf-yaml\") pod \"e4e1674a-ac59-47c4-b004-6a867f666328\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.969411 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-run-httpd\") pod \"e4e1674a-ac59-47c4-b004-6a867f666328\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.969620 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-log-httpd\") pod \"e4e1674a-ac59-47c4-b004-6a867f666328\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.969692 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4e1674a-ac59-47c4-b004-6a867f666328" (UID: "e4e1674a-ac59-47c4-b004-6a867f666328"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.970121 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4e1674a-ac59-47c4-b004-6a867f666328" (UID: "e4e1674a-ac59-47c4-b004-6a867f666328"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.970419 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-combined-ca-bundle\") pod \"e4e1674a-ac59-47c4-b004-6a867f666328\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.970480 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spl2n\" (UniqueName: \"kubernetes.io/projected/e4e1674a-ac59-47c4-b004-6a867f666328-kube-api-access-spl2n\") pod \"e4e1674a-ac59-47c4-b004-6a867f666328\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.970545 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-config-data\") pod \"e4e1674a-ac59-47c4-b004-6a867f666328\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.971206 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-scripts\") pod \"e4e1674a-ac59-47c4-b004-6a867f666328\" (UID: \"e4e1674a-ac59-47c4-b004-6a867f666328\") " Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.973075 4766 reconciler_common.go:293] "Volume 
detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.973131 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e1674a-ac59-47c4-b004-6a867f666328-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.975795 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e1674a-ac59-47c4-b004-6a867f666328-kube-api-access-spl2n" (OuterVolumeSpecName: "kube-api-access-spl2n") pod "e4e1674a-ac59-47c4-b004-6a867f666328" (UID: "e4e1674a-ac59-47c4-b004-6a867f666328"). InnerVolumeSpecName "kube-api-access-spl2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.975802 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4e1674a-ac59-47c4-b004-6a867f666328" (UID: "e4e1674a-ac59-47c4-b004-6a867f666328"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.979580 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-config-data" (OuterVolumeSpecName: "config-data") pod "e4e1674a-ac59-47c4-b004-6a867f666328" (UID: "e4e1674a-ac59-47c4-b004-6a867f666328"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.990146 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4e1674a-ac59-47c4-b004-6a867f666328" (UID: "e4e1674a-ac59-47c4-b004-6a867f666328"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:04 crc kubenswrapper[4766]: I1209 03:36:04.993414 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-scripts" (OuterVolumeSpecName: "scripts") pod "e4e1674a-ac59-47c4-b004-6a867f666328" (UID: "e4e1674a-ac59-47c4-b004-6a867f666328"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.075388 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.075429 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.075442 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spl2n\" (UniqueName: \"kubernetes.io/projected/e4e1674a-ac59-47c4-b004-6a867f666328-kube-api-access-spl2n\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.075457 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-config-data\") on node \"crc\" DevicePath \"\"" 
Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.075470 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e1674a-ac59-47c4-b004-6a867f666328-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.840629 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.928541 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.942515 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.951575 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.956146 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.959585 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.960316 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.987203 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.993732 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.993780 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-config-data\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.993862 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.993907 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-log-httpd\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.993928 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-run-httpd\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.993951 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvc9c\" (UniqueName: \"kubernetes.io/projected/61498550-edcf-4bf5-8528-458189c6b171-kube-api-access-cvc9c\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:05 crc kubenswrapper[4766]: I1209 03:36:05.993977 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-scripts\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.095416 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-log-httpd\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.095472 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-run-httpd\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.095503 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvc9c\" (UniqueName: \"kubernetes.io/projected/61498550-edcf-4bf5-8528-458189c6b171-kube-api-access-cvc9c\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.095531 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-scripts\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.095617 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.095648 
4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-config-data\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.095735 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.096335 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-log-httpd\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.096632 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-run-httpd\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.100724 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-config-data\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.101953 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " 
pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.107490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-scripts\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.111723 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.119735 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvc9c\" (UniqueName: \"kubernetes.io/projected/61498550-edcf-4bf5-8528-458189c6b171-kube-api-access-cvc9c\") pod \"ceilometer-0\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.281238 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.735743 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:06 crc kubenswrapper[4766]: W1209 03:36:06.742149 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61498550_edcf_4bf5_8528_458189c6b171.slice/crio-0ef240330be81bb6fd5b0ae89c2ddf06811cbba2ef5220a089c7b4266ccb9f17 WatchSource:0}: Error finding container 0ef240330be81bb6fd5b0ae89c2ddf06811cbba2ef5220a089c7b4266ccb9f17: Status 404 returned error can't find the container with id 0ef240330be81bb6fd5b0ae89c2ddf06811cbba2ef5220a089c7b4266ccb9f17 Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.851815 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e1674a-ac59-47c4-b004-6a867f666328" path="/var/lib/kubelet/pods/e4e1674a-ac59-47c4-b004-6a867f666328/volumes" Dec 09 03:36:06 crc kubenswrapper[4766]: I1209 03:36:06.852299 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerStarted","Data":"0ef240330be81bb6fd5b0ae89c2ddf06811cbba2ef5220a089c7b4266ccb9f17"} Dec 09 03:36:07 crc kubenswrapper[4766]: I1209 03:36:07.316542 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:36:07 crc kubenswrapper[4766]: I1209 03:36:07.316846 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 09 03:36:07 crc kubenswrapper[4766]: I1209 03:36:07.870380 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerStarted","Data":"449d2dddaa0f97deefa2998a70a2009afc79446167e01a0e225a949bcd70b905"} Dec 09 03:36:08 crc kubenswrapper[4766]: I1209 03:36:08.884490 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerStarted","Data":"aa1b6bc2d208e72f0a55decbffc820ca0e6a6e6ddd531379c4f4b69be452c224"} Dec 09 03:36:08 crc kubenswrapper[4766]: I1209 03:36:08.885081 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerStarted","Data":"3db5b2f3ce4dc714b8a30ad8ec2e3babbb0b659bc2980b720781766e884a614d"} Dec 09 03:36:09 crc kubenswrapper[4766]: I1209 03:36:09.894280 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerStarted","Data":"949ff15f71d458a94f6dfdcf0fdf4392ce78c50c4a716dd4dcce6614bb75c2ca"} Dec 09 03:36:09 crc kubenswrapper[4766]: I1209 03:36:09.895450 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 03:36:09 crc kubenswrapper[4766]: I1209 03:36:09.925512 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.096029263 podStartE2EDuration="4.925485968s" podCreationTimestamp="2025-12-09 03:36:05 +0000 UTC" firstStartedPulling="2025-12-09 03:36:06.744723229 +0000 UTC m=+1448.454028665" lastFinishedPulling="2025-12-09 03:36:09.574179944 +0000 UTC m=+1451.283485370" observedRunningTime="2025-12-09 03:36:09.917974896 +0000 UTC m=+1451.627280312" watchObservedRunningTime="2025-12-09 03:36:09.925485968 +0000 UTC 
m=+1451.634791394" Dec 09 03:36:14 crc kubenswrapper[4766]: I1209 03:36:14.952637 4766 generic.go:334] "Generic (PLEG): container finished" podID="1c8159d8-7758-4a35-ba5c-51ce1edae988" containerID="d9fab7dfeb3826cd0321cfd7d0feacdb6d86c47207cf4911ed9ec23c6780f0a1" exitCode=0 Dec 09 03:36:14 crc kubenswrapper[4766]: I1209 03:36:14.953207 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5f62v" event={"ID":"1c8159d8-7758-4a35-ba5c-51ce1edae988","Type":"ContainerDied","Data":"d9fab7dfeb3826cd0321cfd7d0feacdb6d86c47207cf4911ed9ec23c6780f0a1"} Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.306540 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.314457 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dqgtt"] Dec 09 03:36:16 crc kubenswrapper[4766]: E1209 03:36:16.314945 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8159d8-7758-4a35-ba5c-51ce1edae988" containerName="nova-cell0-conductor-db-sync" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.314964 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8159d8-7758-4a35-ba5c-51ce1edae988" containerName="nova-cell0-conductor-db-sync" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.315157 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8159d8-7758-4a35-ba5c-51ce1edae988" containerName="nova-cell0-conductor-db-sync" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.316458 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.325625 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dqgtt"] Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.491151 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-scripts\") pod \"1c8159d8-7758-4a35-ba5c-51ce1edae988\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.491416 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9lgd\" (UniqueName: \"kubernetes.io/projected/1c8159d8-7758-4a35-ba5c-51ce1edae988-kube-api-access-r9lgd\") pod \"1c8159d8-7758-4a35-ba5c-51ce1edae988\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.491452 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-combined-ca-bundle\") pod \"1c8159d8-7758-4a35-ba5c-51ce1edae988\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.491493 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-config-data\") pod \"1c8159d8-7758-4a35-ba5c-51ce1edae988\" (UID: \"1c8159d8-7758-4a35-ba5c-51ce1edae988\") " Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.491973 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-catalog-content\") pod \"redhat-operators-dqgtt\" (UID: 
\"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.492011 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-utilities\") pod \"redhat-operators-dqgtt\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.492047 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gvfm\" (UniqueName: \"kubernetes.io/projected/0b8094ea-63d8-4974-94d7-55e33813ed16-kube-api-access-9gvfm\") pod \"redhat-operators-dqgtt\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.498172 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-scripts" (OuterVolumeSpecName: "scripts") pod "1c8159d8-7758-4a35-ba5c-51ce1edae988" (UID: "1c8159d8-7758-4a35-ba5c-51ce1edae988"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.504386 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8159d8-7758-4a35-ba5c-51ce1edae988-kube-api-access-r9lgd" (OuterVolumeSpecName: "kube-api-access-r9lgd") pod "1c8159d8-7758-4a35-ba5c-51ce1edae988" (UID: "1c8159d8-7758-4a35-ba5c-51ce1edae988"). InnerVolumeSpecName "kube-api-access-r9lgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.519340 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-config-data" (OuterVolumeSpecName: "config-data") pod "1c8159d8-7758-4a35-ba5c-51ce1edae988" (UID: "1c8159d8-7758-4a35-ba5c-51ce1edae988"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.523399 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c8159d8-7758-4a35-ba5c-51ce1edae988" (UID: "1c8159d8-7758-4a35-ba5c-51ce1edae988"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.593672 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-catalog-content\") pod \"redhat-operators-dqgtt\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.593713 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-utilities\") pod \"redhat-operators-dqgtt\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.593739 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gvfm\" (UniqueName: \"kubernetes.io/projected/0b8094ea-63d8-4974-94d7-55e33813ed16-kube-api-access-9gvfm\") pod 
\"redhat-operators-dqgtt\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.593864 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9lgd\" (UniqueName: \"kubernetes.io/projected/1c8159d8-7758-4a35-ba5c-51ce1edae988-kube-api-access-r9lgd\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.593878 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.593888 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.593897 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c8159d8-7758-4a35-ba5c-51ce1edae988-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.594150 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-catalog-content\") pod \"redhat-operators-dqgtt\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.594204 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-utilities\") pod \"redhat-operators-dqgtt\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 
03:36:16.612037 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gvfm\" (UniqueName: \"kubernetes.io/projected/0b8094ea-63d8-4974-94d7-55e33813ed16-kube-api-access-9gvfm\") pod \"redhat-operators-dqgtt\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.637992 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.973587 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5f62v" event={"ID":"1c8159d8-7758-4a35-ba5c-51ce1edae988","Type":"ContainerDied","Data":"36ff8fe199791b7e06f5fa6030e8351f883ea05f94e5d8e965e56f43d8d8f419"} Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.973880 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36ff8fe199791b7e06f5fa6030e8351f883ea05f94e5d8e965e56f43d8d8f419" Dec 09 03:36:16 crc kubenswrapper[4766]: I1209 03:36:16.973665 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5f62v" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.085833 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.087533 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.089367 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-v6kg7" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.090040 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.101186 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.132918 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dqgtt"] Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.203589 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.203806 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.203854 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbhp\" (UniqueName: \"kubernetes.io/projected/ec41cb84-c47e-4199-ac5d-825bbf4f7023-kube-api-access-mxbhp\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.305601 
4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.305797 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.305844 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbhp\" (UniqueName: \"kubernetes.io/projected/ec41cb84-c47e-4199-ac5d-825bbf4f7023-kube-api-access-mxbhp\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.310964 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.313115 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.324774 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbhp\" (UniqueName: 
\"kubernetes.io/projected/ec41cb84-c47e-4199-ac5d-825bbf4f7023-kube-api-access-mxbhp\") pod \"nova-cell0-conductor-0\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.403743 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.974967 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.984473 4766 generic.go:334] "Generic (PLEG): container finished" podID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerID="aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8" exitCode=0 Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.984531 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqgtt" event={"ID":"0b8094ea-63d8-4974-94d7-55e33813ed16","Type":"ContainerDied","Data":"aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8"} Dec 09 03:36:17 crc kubenswrapper[4766]: I1209 03:36:17.984767 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqgtt" event={"ID":"0b8094ea-63d8-4974-94d7-55e33813ed16","Type":"ContainerStarted","Data":"26695224dda7e6e041746b65f8447547db567ad9ae2501f15892369daf687825"} Dec 09 03:36:18 crc kubenswrapper[4766]: I1209 03:36:18.997605 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ec41cb84-c47e-4199-ac5d-825bbf4f7023","Type":"ContainerStarted","Data":"d9c17096fe1163fbded2f9420441c595a2d9ae76cf7dab37fbf8a31e46afc79e"} Dec 09 03:36:19 crc kubenswrapper[4766]: I1209 03:36:18.999498 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"ec41cb84-c47e-4199-ac5d-825bbf4f7023","Type":"ContainerStarted","Data":"1c3761e21b8f09276f63e9f5a8e2c9438d29c4a9e0f21e68be9f120fabb3f02e"} Dec 09 03:36:19 crc kubenswrapper[4766]: I1209 03:36:18.999533 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:19 crc kubenswrapper[4766]: I1209 03:36:19.019093 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.019070616 podStartE2EDuration="2.019070616s" podCreationTimestamp="2025-12-09 03:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:19.014922445 +0000 UTC m=+1460.724227891" watchObservedRunningTime="2025-12-09 03:36:19.019070616 +0000 UTC m=+1460.728376062" Dec 09 03:36:20 crc kubenswrapper[4766]: I1209 03:36:20.020631 4766 generic.go:334] "Generic (PLEG): container finished" podID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerID="9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087" exitCode=0 Dec 09 03:36:20 crc kubenswrapper[4766]: I1209 03:36:20.020724 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqgtt" event={"ID":"0b8094ea-63d8-4974-94d7-55e33813ed16","Type":"ContainerDied","Data":"9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087"} Dec 09 03:36:21 crc kubenswrapper[4766]: I1209 03:36:21.034455 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqgtt" event={"ID":"0b8094ea-63d8-4974-94d7-55e33813ed16","Type":"ContainerStarted","Data":"416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f"} Dec 09 03:36:21 crc kubenswrapper[4766]: I1209 03:36:21.075891 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dqgtt" 
podStartSLOduration=2.5640298379999997 podStartE2EDuration="5.075854243s" podCreationTimestamp="2025-12-09 03:36:16 +0000 UTC" firstStartedPulling="2025-12-09 03:36:17.985793476 +0000 UTC m=+1459.695098902" lastFinishedPulling="2025-12-09 03:36:20.497617891 +0000 UTC m=+1462.206923307" observedRunningTime="2025-12-09 03:36:21.064914969 +0000 UTC m=+1462.774220395" watchObservedRunningTime="2025-12-09 03:36:21.075854243 +0000 UTC m=+1462.785159719" Dec 09 03:36:26 crc kubenswrapper[4766]: I1209 03:36:26.639160 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:26 crc kubenswrapper[4766]: I1209 03:36:26.640143 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:27 crc kubenswrapper[4766]: I1209 03:36:27.441965 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 03:36:27 crc kubenswrapper[4766]: I1209 03:36:27.702283 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dqgtt" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerName="registry-server" probeResult="failure" output=< Dec 09 03:36:27 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 03:36:27 crc kubenswrapper[4766]: > Dec 09 03:36:27 crc kubenswrapper[4766]: I1209 03:36:27.860327 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lnrlz"] Dec 09 03:36:27 crc kubenswrapper[4766]: I1209 03:36:27.861431 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:27 crc kubenswrapper[4766]: I1209 03:36:27.863440 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 09 03:36:27 crc kubenswrapper[4766]: I1209 03:36:27.863447 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 09 03:36:27 crc kubenswrapper[4766]: I1209 03:36:27.879421 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnrlz"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.013347 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.015010 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.016753 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.034870 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.046024 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wth26\" (UniqueName: \"kubernetes.io/projected/8de191a4-7cf0-4999-a102-b96a06b2ba24-kube-api-access-wth26\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.046065 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " 
pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.046757 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-config-data\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.046805 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-scripts\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.055085 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.056821 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.068688 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.106709 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.148124 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wth26\" (UniqueName: \"kubernetes.io/projected/8de191a4-7cf0-4999-a102-b96a06b2ba24-kube-api-access-wth26\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.148389 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.148545 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vsbk\" (UniqueName: \"kubernetes.io/projected/527525da-6fc1-4ed4-ab29-017bf44dd58e-kube-api-access-5vsbk\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.148638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 
03:36:28.148726 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-config-data\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.148817 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-scripts\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.148919 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-config-data\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.155382 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-scripts\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.156294 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527525da-6fc1-4ed4-ab29-017bf44dd58e-logs\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.164897 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.173485 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.174950 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.180883 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-config-data\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.183065 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.183727 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wth26\" (UniqueName: \"kubernetes.io/projected/8de191a4-7cf0-4999-a102-b96a06b2ba24-kube-api-access-wth26\") pod \"nova-cell0-cell-mapping-lnrlz\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.195361 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.215108 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.239234 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dgqtt"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.241538 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.275767 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxlxh\" (UniqueName: \"kubernetes.io/projected/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-kube-api-access-pxlxh\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.275988 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.276054 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-logs\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.279359 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-497nh\" (UniqueName: 
\"kubernetes.io/projected/c952df3c-b573-4763-948d-1e73d6d85514-kube-api-access-497nh\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.279646 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vsbk\" (UniqueName: \"kubernetes.io/projected/527525da-6fc1-4ed4-ab29-017bf44dd58e-kube-api-access-5vsbk\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.280484 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.280557 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-config-data\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.280606 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.280658 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-config\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" 
(UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.280725 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-config-data\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.280828 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d29dz\" (UniqueName: \"kubernetes.io/projected/fd933eb5-5f13-4e72-910c-aa495cfae9f3-kube-api-access-d29dz\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.280869 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.281506 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527525da-6fc1-4ed4-ab29-017bf44dd58e-logs\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.283154 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " 
pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.283199 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.283266 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-config-data\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.283319 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.284786 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527525da-6fc1-4ed4-ab29-017bf44dd58e-logs\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.294328 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-config-data\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.301478 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.321013 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vsbk\" (UniqueName: \"kubernetes.io/projected/527525da-6fc1-4ed4-ab29-017bf44dd58e-kube-api-access-5vsbk\") pod \"nova-api-0\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.335535 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.349268 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.360192 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.362664 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.371828 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dgqtt"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385763 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385812 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385834 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-config-data\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385862 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-config\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385896 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f9ms\" (UniqueName: \"kubernetes.io/projected/765c23d8-f967-4624-8a2e-82ecc9788177-kube-api-access-9f9ms\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385918 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385941 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d29dz\" (UniqueName: \"kubernetes.io/projected/fd933eb5-5f13-4e72-910c-aa495cfae9f3-kube-api-access-d29dz\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385956 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.385992 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 
03:36:28.386008 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.386030 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-config-data\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.386052 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.386070 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlxh\" (UniqueName: \"kubernetes.io/projected/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-kube-api-access-pxlxh\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.386088 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.386110 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-logs\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.386141 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-497nh\" (UniqueName: \"kubernetes.io/projected/c952df3c-b573-4763-948d-1e73d6d85514-kube-api-access-497nh\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.389313 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.389841 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.390443 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-config\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.390605 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: 
\"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.392163 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-logs\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.392526 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.399600 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-config-data\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.404190 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-config-data\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.409849 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.410678 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " 
pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.412507 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.419498 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxlxh\" (UniqueName: \"kubernetes.io/projected/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-kube-api-access-pxlxh\") pod \"nova-metadata-0\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.437581 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-497nh\" (UniqueName: \"kubernetes.io/projected/c952df3c-b573-4763-948d-1e73d6d85514-kube-api-access-497nh\") pod \"nova-scheduler-0\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.438825 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d29dz\" (UniqueName: \"kubernetes.io/projected/fd933eb5-5f13-4e72-910c-aa495cfae9f3-kube-api-access-d29dz\") pod \"dnsmasq-dns-845d6d6f59-dgqtt\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.487536 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.487632 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9f9ms\" (UniqueName: \"kubernetes.io/projected/765c23d8-f967-4624-8a2e-82ecc9788177-kube-api-access-9f9ms\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.487663 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.495703 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.506292 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.512230 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f9ms\" (UniqueName: \"kubernetes.io/projected/765c23d8-f967-4624-8a2e-82ecc9788177-kube-api-access-9f9ms\") pod \"nova-cell1-novncproxy-0\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.708399 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.708911 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.717934 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.735857 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.917747 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnrlz"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.991861 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7kpv"] Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.993958 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.997146 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 03:36:28 crc kubenswrapper[4766]: I1209 03:36:28.999004 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.014298 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7kpv"] Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.048726 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.098723 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-scripts\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.098788 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.098832 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-config-data\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 
crc kubenswrapper[4766]: I1209 03:36:29.098858 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zfw\" (UniqueName: \"kubernetes.io/projected/e4623229-9cd3-4c95-bba1-1c202cb8f07c-kube-api-access-t2zfw\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.125769 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"527525da-6fc1-4ed4-ab29-017bf44dd58e","Type":"ContainerStarted","Data":"976a6189b31cce0c7a8946982bee72faf96a800bdce19aaf5f9f3d02cba0e3e4"} Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.127254 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnrlz" event={"ID":"8de191a4-7cf0-4999-a102-b96a06b2ba24","Type":"ContainerStarted","Data":"119aa14fd95227770d218be27373b7af457db0162daa7a8760de0a8d35f43f3c"} Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.200521 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-scripts\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.200585 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.200615 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-config-data\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.200638 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zfw\" (UniqueName: \"kubernetes.io/projected/e4623229-9cd3-4c95-bba1-1c202cb8f07c-kube-api-access-t2zfw\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.208643 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-config-data\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.210499 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.223769 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-scripts\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.225187 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zfw\" (UniqueName: 
\"kubernetes.io/projected/e4623229-9cd3-4c95-bba1-1c202cb8f07c-kube-api-access-t2zfw\") pod \"nova-cell1-conductor-db-sync-c7kpv\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.347206 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.353942 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:36:29 crc kubenswrapper[4766]: W1209 03:36:29.370821 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod765c23d8_f967_4624_8a2e_82ecc9788177.slice/crio-576f4341c24a60f4bc27348fca37944e064d32ae2e06e3223928ba2911af1dae WatchSource:0}: Error finding container 576f4341c24a60f4bc27348fca37944e064d32ae2e06e3223928ba2911af1dae: Status 404 returned error can't find the container with id 576f4341c24a60f4bc27348fca37944e064d32ae2e06e3223928ba2911af1dae Dec 09 03:36:29 crc kubenswrapper[4766]: W1209 03:36:29.379388 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc952df3c_b573_4763_948d_1e73d6d85514.slice/crio-d592fc87f935ffa0f7ca9a56cd1689e5eca8ad64a55e9fb87e6e4ba639cc096e WatchSource:0}: Error finding container d592fc87f935ffa0f7ca9a56cd1689e5eca8ad64a55e9fb87e6e4ba639cc096e: Status 404 returned error can't find the container with id d592fc87f935ffa0f7ca9a56cd1689e5eca8ad64a55e9fb87e6e4ba639cc096e Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.395433 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.407427 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dgqtt"] Dec 09 03:36:29 crc 
kubenswrapper[4766]: I1209 03:36:29.539751 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:29 crc kubenswrapper[4766]: I1209 03:36:29.933405 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7kpv"] Dec 09 03:36:29 crc kubenswrapper[4766]: W1209 03:36:29.934802 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4623229_9cd3_4c95_bba1_1c202cb8f07c.slice/crio-e56aca4f9e087a36dbe7c807755386fd3d6db364451b426f3e8738d22827569b WatchSource:0}: Error finding container e56aca4f9e087a36dbe7c807755386fd3d6db364451b426f3e8738d22827569b: Status 404 returned error can't find the container with id e56aca4f9e087a36dbe7c807755386fd3d6db364451b426f3e8738d22827569b Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.205773 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" event={"ID":"e4623229-9cd3-4c95-bba1-1c202cb8f07c","Type":"ContainerStarted","Data":"e56aca4f9e087a36dbe7c807755386fd3d6db364451b426f3e8738d22827569b"} Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.248397 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnrlz" event={"ID":"8de191a4-7cf0-4999-a102-b96a06b2ba24","Type":"ContainerStarted","Data":"95f400ae3bc5fed8d76ed756e9cc0580f2a32e514a76972f4091c1cce5e175e5"} Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.277500 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"765c23d8-f967-4624-8a2e-82ecc9788177","Type":"ContainerStarted","Data":"576f4341c24a60f4bc27348fca37944e064d32ae2e06e3223928ba2911af1dae"} Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.309831 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lnrlz" 
podStartSLOduration=3.309807869 podStartE2EDuration="3.309807869s" podCreationTimestamp="2025-12-09 03:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:30.276674118 +0000 UTC m=+1471.985979554" watchObservedRunningTime="2025-12-09 03:36:30.309807869 +0000 UTC m=+1472.019113295" Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.318684 4766 generic.go:334] "Generic (PLEG): container finished" podID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerID="ca2afbaa30f2f394604235f5720c5aa4e72406b67d1ca341aaee0ebf94179748" exitCode=0 Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.318763 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" event={"ID":"fd933eb5-5f13-4e72-910c-aa495cfae9f3","Type":"ContainerDied","Data":"ca2afbaa30f2f394604235f5720c5aa4e72406b67d1ca341aaee0ebf94179748"} Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.318789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" event={"ID":"fd933eb5-5f13-4e72-910c-aa495cfae9f3","Type":"ContainerStarted","Data":"9615a2ea011fcb06065675ed4612ac1831a66023d62abf069c5d66631ad5e92f"} Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.329198 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3821b38e-9fb2-45eb-a600-dcd5a25bcac9","Type":"ContainerStarted","Data":"4679dc5bd42e72afb2c6886579ff859bb9957186832aff154b8077ed42904a8a"} Dec 09 03:36:30 crc kubenswrapper[4766]: I1209 03:36:30.344606 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c952df3c-b573-4763-948d-1e73d6d85514","Type":"ContainerStarted","Data":"d592fc87f935ffa0f7ca9a56cd1689e5eca8ad64a55e9fb87e6e4ba639cc096e"} Dec 09 03:36:31 crc kubenswrapper[4766]: I1209 03:36:31.360091 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" event={"ID":"fd933eb5-5f13-4e72-910c-aa495cfae9f3","Type":"ContainerStarted","Data":"ce3ccf75f0cf375ecd11a04f0e8f5e4eaab059c87c848791d6110f0f4ae70604"} Dec 09 03:36:31 crc kubenswrapper[4766]: I1209 03:36:31.361172 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:31 crc kubenswrapper[4766]: I1209 03:36:31.362407 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" event={"ID":"e4623229-9cd3-4c95-bba1-1c202cb8f07c","Type":"ContainerStarted","Data":"c195a4bf15fe20f9423e97eb1ce60f9483c3da4af5de80348bebc92f5f6abde2"} Dec 09 03:36:31 crc kubenswrapper[4766]: I1209 03:36:31.387018 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" podStartSLOduration=3.3870017519999998 podStartE2EDuration="3.387001752s" podCreationTimestamp="2025-12-09 03:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:31.380522888 +0000 UTC m=+1473.089828314" watchObservedRunningTime="2025-12-09 03:36:31.387001752 +0000 UTC m=+1473.096307178" Dec 09 03:36:31 crc kubenswrapper[4766]: I1209 03:36:31.406664 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" podStartSLOduration=3.40664618 podStartE2EDuration="3.40664618s" podCreationTimestamp="2025-12-09 03:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:31.398617944 +0000 UTC m=+1473.107923380" watchObservedRunningTime="2025-12-09 03:36:31.40664618 +0000 UTC m=+1473.115951606" Dec 09 03:36:31 crc kubenswrapper[4766]: I1209 03:36:31.844121 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 
09 03:36:31 crc kubenswrapper[4766]: I1209 03:36:31.886169 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.415719 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c952df3c-b573-4763-948d-1e73d6d85514","Type":"ContainerStarted","Data":"75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8"} Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.420229 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"527525da-6fc1-4ed4-ab29-017bf44dd58e","Type":"ContainerStarted","Data":"f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5"} Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.420271 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"527525da-6fc1-4ed4-ab29-017bf44dd58e","Type":"ContainerStarted","Data":"f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e"} Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.422150 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"765c23d8-f967-4624-8a2e-82ecc9788177","Type":"ContainerStarted","Data":"c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744"} Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.422298 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="765c23d8-f967-4624-8a2e-82ecc9788177" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744" gracePeriod=30 Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.424462 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3821b38e-9fb2-45eb-a600-dcd5a25bcac9","Type":"ContainerStarted","Data":"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7"} Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.424494 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3821b38e-9fb2-45eb-a600-dcd5a25bcac9","Type":"ContainerStarted","Data":"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c"} Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.424590 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerName="nova-metadata-log" containerID="cri-o://1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c" gracePeriod=30 Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.424678 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerName="nova-metadata-metadata" containerID="cri-o://f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7" gracePeriod=30 Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.435852 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.242751784 podStartE2EDuration="6.435826609s" podCreationTimestamp="2025-12-09 03:36:28 +0000 UTC" firstStartedPulling="2025-12-09 03:36:29.385326307 +0000 UTC m=+1471.094631733" lastFinishedPulling="2025-12-09 03:36:33.578401122 +0000 UTC m=+1475.287706558" observedRunningTime="2025-12-09 03:36:34.433233459 +0000 UTC m=+1476.142538905" watchObservedRunningTime="2025-12-09 03:36:34.435826609 +0000 UTC m=+1476.145132055" Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.456148 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.2480014170000002 
podStartE2EDuration="6.456126476s" podCreationTimestamp="2025-12-09 03:36:28 +0000 UTC" firstStartedPulling="2025-12-09 03:36:29.381523685 +0000 UTC m=+1471.090829111" lastFinishedPulling="2025-12-09 03:36:33.589648734 +0000 UTC m=+1475.298954170" observedRunningTime="2025-12-09 03:36:34.449793754 +0000 UTC m=+1476.159099180" watchObservedRunningTime="2025-12-09 03:36:34.456126476 +0000 UTC m=+1476.165431902" Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.491301 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.963377944 podStartE2EDuration="7.491282081s" podCreationTimestamp="2025-12-09 03:36:27 +0000 UTC" firstStartedPulling="2025-12-09 03:36:29.071625584 +0000 UTC m=+1470.780931010" lastFinishedPulling="2025-12-09 03:36:33.599529701 +0000 UTC m=+1475.308835147" observedRunningTime="2025-12-09 03:36:34.490074689 +0000 UTC m=+1476.199380115" watchObservedRunningTime="2025-12-09 03:36:34.491282081 +0000 UTC m=+1476.200587497" Dec 09 03:36:34 crc kubenswrapper[4766]: I1209 03:36:34.492561 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.468777038 podStartE2EDuration="6.492555256s" podCreationTimestamp="2025-12-09 03:36:28 +0000 UTC" firstStartedPulling="2025-12-09 03:36:29.564776237 +0000 UTC m=+1471.274081653" lastFinishedPulling="2025-12-09 03:36:33.588554435 +0000 UTC m=+1475.297859871" observedRunningTime="2025-12-09 03:36:34.475585728 +0000 UTC m=+1476.184891154" watchObservedRunningTime="2025-12-09 03:36:34.492555256 +0000 UTC m=+1476.201860682" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.015049 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.063034 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-config-data\") pod \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.063318 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-logs\") pod \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.063392 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-combined-ca-bundle\") pod \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.063426 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxlxh\" (UniqueName: \"kubernetes.io/projected/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-kube-api-access-pxlxh\") pod \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\" (UID: \"3821b38e-9fb2-45eb-a600-dcd5a25bcac9\") " Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.063900 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-logs" (OuterVolumeSpecName: "logs") pod "3821b38e-9fb2-45eb-a600-dcd5a25bcac9" (UID: "3821b38e-9fb2-45eb-a600-dcd5a25bcac9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.071362 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-kube-api-access-pxlxh" (OuterVolumeSpecName: "kube-api-access-pxlxh") pod "3821b38e-9fb2-45eb-a600-dcd5a25bcac9" (UID: "3821b38e-9fb2-45eb-a600-dcd5a25bcac9"). InnerVolumeSpecName "kube-api-access-pxlxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.104696 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-config-data" (OuterVolumeSpecName: "config-data") pod "3821b38e-9fb2-45eb-a600-dcd5a25bcac9" (UID: "3821b38e-9fb2-45eb-a600-dcd5a25bcac9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.105350 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3821b38e-9fb2-45eb-a600-dcd5a25bcac9" (UID: "3821b38e-9fb2-45eb-a600-dcd5a25bcac9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.166280 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.166315 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxlxh\" (UniqueName: \"kubernetes.io/projected/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-kube-api-access-pxlxh\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.166327 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.166335 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3821b38e-9fb2-45eb-a600-dcd5a25bcac9-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.434661 4766 generic.go:334] "Generic (PLEG): container finished" podID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerID="f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7" exitCode=0 Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.434715 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3821b38e-9fb2-45eb-a600-dcd5a25bcac9","Type":"ContainerDied","Data":"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7"} Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.434752 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3821b38e-9fb2-45eb-a600-dcd5a25bcac9","Type":"ContainerDied","Data":"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c"} Dec 09 03:36:35 crc 
kubenswrapper[4766]: I1209 03:36:35.434701 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.434726 4766 generic.go:334] "Generic (PLEG): container finished" podID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerID="1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c" exitCode=143 Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.434840 4766 scope.go:117] "RemoveContainer" containerID="f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.435062 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3821b38e-9fb2-45eb-a600-dcd5a25bcac9","Type":"ContainerDied","Data":"4679dc5bd42e72afb2c6886579ff859bb9957186832aff154b8077ed42904a8a"} Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.465060 4766 scope.go:117] "RemoveContainer" containerID="1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.476134 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.485638 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.500544 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:35 crc kubenswrapper[4766]: E1209 03:36:35.500908 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerName="nova-metadata-metadata" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.500925 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerName="nova-metadata-metadata" Dec 09 03:36:35 crc kubenswrapper[4766]: E1209 03:36:35.500943 
4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerName="nova-metadata-log" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.500949 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerName="nova-metadata-log" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.501113 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerName="nova-metadata-metadata" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.501130 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" containerName="nova-metadata-log" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.502075 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.504593 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.505090 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.507255 4766 scope.go:117] "RemoveContainer" containerID="f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7" Dec 09 03:36:35 crc kubenswrapper[4766]: E1209 03:36:35.507811 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7\": container with ID starting with f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7 not found: ID does not exist" containerID="f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.507848 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7"} err="failed to get container status \"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7\": rpc error: code = NotFound desc = could not find container \"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7\": container with ID starting with f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7 not found: ID does not exist" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.507872 4766 scope.go:117] "RemoveContainer" containerID="1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c" Dec 09 03:36:35 crc kubenswrapper[4766]: E1209 03:36:35.508335 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c\": container with ID starting with 1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c not found: ID does not exist" containerID="1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.508368 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c"} err="failed to get container status \"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c\": rpc error: code = NotFound desc = could not find container \"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c\": container with ID starting with 1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c not found: ID does not exist" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.508395 4766 scope.go:117] "RemoveContainer" containerID="f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 
03:36:35.508886 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7"} err="failed to get container status \"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7\": rpc error: code = NotFound desc = could not find container \"f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7\": container with ID starting with f377d04368dae82206847ab8d07be26babe7068f5612a3201f85e472d14126d7 not found: ID does not exist" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.508926 4766 scope.go:117] "RemoveContainer" containerID="1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.509284 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c"} err="failed to get container status \"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c\": rpc error: code = NotFound desc = could not find container \"1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c\": container with ID starting with 1af5501daa7930e7912593b947025b847cd6aab38fbabaa811253f46b020e42c not found: ID does not exist" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.519382 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.574980 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.575390 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-config-data\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.575577 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrw6m\" (UniqueName: \"kubernetes.io/projected/848cf936-5ff0-4429-b4d4-5d0864e95cfc-kube-api-access-qrw6m\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.575774 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.575878 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/848cf936-5ff0-4429-b4d4-5d0864e95cfc-logs\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.677540 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.677631 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-config-data\") 
pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.677679 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrw6m\" (UniqueName: \"kubernetes.io/projected/848cf936-5ff0-4429-b4d4-5d0864e95cfc-kube-api-access-qrw6m\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.677703 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.677730 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/848cf936-5ff0-4429-b4d4-5d0864e95cfc-logs\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.678148 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/848cf936-5ff0-4429-b4d4-5d0864e95cfc-logs\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.681301 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.682763 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-config-data\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.683730 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.710721 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrw6m\" (UniqueName: \"kubernetes.io/projected/848cf936-5ff0-4429-b4d4-5d0864e95cfc-kube-api-access-qrw6m\") pod \"nova-metadata-0\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " pod="openstack/nova-metadata-0" Dec 09 03:36:35 crc kubenswrapper[4766]: I1209 03:36:35.822077 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:36 crc kubenswrapper[4766]: I1209 03:36:36.296014 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 03:36:36 crc kubenswrapper[4766]: W1209 03:36:36.311576 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod848cf936_5ff0_4429_b4d4_5d0864e95cfc.slice/crio-1fafc5e492ba74cfd91f2b4e5bb977d3037301b92c020ef06e0a2d1600aa3c91 WatchSource:0}: Error finding container 1fafc5e492ba74cfd91f2b4e5bb977d3037301b92c020ef06e0a2d1600aa3c91: Status 404 returned error can't find the container with id 1fafc5e492ba74cfd91f2b4e5bb977d3037301b92c020ef06e0a2d1600aa3c91 Dec 09 03:36:36 crc kubenswrapper[4766]: I1209 03:36:36.327803 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:36 crc kubenswrapper[4766]: I1209 03:36:36.455508 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"848cf936-5ff0-4429-b4d4-5d0864e95cfc","Type":"ContainerStarted","Data":"1fafc5e492ba74cfd91f2b4e5bb977d3037301b92c020ef06e0a2d1600aa3c91"} Dec 09 03:36:36 crc kubenswrapper[4766]: I1209 03:36:36.712741 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:36 crc kubenswrapper[4766]: I1209 03:36:36.792657 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:36 crc kubenswrapper[4766]: I1209 03:36:36.849794 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3821b38e-9fb2-45eb-a600-dcd5a25bcac9" path="/var/lib/kubelet/pods/3821b38e-9fb2-45eb-a600-dcd5a25bcac9/volumes" Dec 09 03:36:36 crc kubenswrapper[4766]: I1209 03:36:36.946838 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-dqgtt"] Dec 09 03:36:37 crc kubenswrapper[4766]: I1209 03:36:37.316257 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:36:37 crc kubenswrapper[4766]: I1209 03:36:37.316516 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:36:37 crc kubenswrapper[4766]: I1209 03:36:37.473928 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"848cf936-5ff0-4429-b4d4-5d0864e95cfc","Type":"ContainerStarted","Data":"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba"} Dec 09 03:36:37 crc kubenswrapper[4766]: I1209 03:36:37.474287 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"848cf936-5ff0-4429-b4d4-5d0864e95cfc","Type":"ContainerStarted","Data":"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037"} Dec 09 03:36:37 crc kubenswrapper[4766]: I1209 03:36:37.477247 4766 generic.go:334] "Generic (PLEG): container finished" podID="8de191a4-7cf0-4999-a102-b96a06b2ba24" containerID="95f400ae3bc5fed8d76ed756e9cc0580f2a32e514a76972f4091c1cce5e175e5" exitCode=0 Dec 09 03:36:37 crc kubenswrapper[4766]: I1209 03:36:37.477330 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnrlz" event={"ID":"8de191a4-7cf0-4999-a102-b96a06b2ba24","Type":"ContainerDied","Data":"95f400ae3bc5fed8d76ed756e9cc0580f2a32e514a76972f4091c1cce5e175e5"} Dec 09 03:36:37 crc 
kubenswrapper[4766]: I1209 03:36:37.547129 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.547108497 podStartE2EDuration="2.547108497s" podCreationTimestamp="2025-12-09 03:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:37.514878389 +0000 UTC m=+1479.224183825" watchObservedRunningTime="2025-12-09 03:36:37.547108497 +0000 UTC m=+1479.256413943" Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.336328 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.336388 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.488823 4766 generic.go:334] "Generic (PLEG): container finished" podID="e4623229-9cd3-4c95-bba1-1c202cb8f07c" containerID="c195a4bf15fe20f9423e97eb1ce60f9483c3da4af5de80348bebc92f5f6abde2" exitCode=0 Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.489072 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dqgtt" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerName="registry-server" containerID="cri-o://416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f" gracePeriod=2 Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.489144 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" event={"ID":"e4623229-9cd3-4c95-bba1-1c202cb8f07c","Type":"ContainerDied","Data":"c195a4bf15fe20f9423e97eb1ce60f9483c3da4af5de80348bebc92f5f6abde2"} Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.708954 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 03:36:38 
crc kubenswrapper[4766]: I1209 03:36:38.709320 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.719537 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.736640 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.765859 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.822879 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ns6dc"] Dec 09 03:36:38 crc kubenswrapper[4766]: I1209 03:36:38.823126 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" podUID="5ee1455b-319a-4093-85ba-0b97e662ecf8" containerName="dnsmasq-dns" containerID="cri-o://37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f" gracePeriod=10 Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.083857 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.094115 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.184817 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-scripts\") pod \"8de191a4-7cf0-4999-a102-b96a06b2ba24\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.185421 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wth26\" (UniqueName: \"kubernetes.io/projected/8de191a4-7cf0-4999-a102-b96a06b2ba24-kube-api-access-wth26\") pod \"8de191a4-7cf0-4999-a102-b96a06b2ba24\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.185497 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-combined-ca-bundle\") pod \"8de191a4-7cf0-4999-a102-b96a06b2ba24\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.185540 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-config-data\") pod \"8de191a4-7cf0-4999-a102-b96a06b2ba24\" (UID: \"8de191a4-7cf0-4999-a102-b96a06b2ba24\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.193639 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-scripts" (OuterVolumeSpecName: "scripts") pod "8de191a4-7cf0-4999-a102-b96a06b2ba24" (UID: "8de191a4-7cf0-4999-a102-b96a06b2ba24"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.199182 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de191a4-7cf0-4999-a102-b96a06b2ba24-kube-api-access-wth26" (OuterVolumeSpecName: "kube-api-access-wth26") pod "8de191a4-7cf0-4999-a102-b96a06b2ba24" (UID: "8de191a4-7cf0-4999-a102-b96a06b2ba24"). InnerVolumeSpecName "kube-api-access-wth26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.233404 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de191a4-7cf0-4999-a102-b96a06b2ba24" (UID: "8de191a4-7cf0-4999-a102-b96a06b2ba24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.260800 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-config-data" (OuterVolumeSpecName: "config-data") pod "8de191a4-7cf0-4999-a102-b96a06b2ba24" (UID: "8de191a4-7cf0-4999-a102-b96a06b2ba24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.286858 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gvfm\" (UniqueName: \"kubernetes.io/projected/0b8094ea-63d8-4974-94d7-55e33813ed16-kube-api-access-9gvfm\") pod \"0b8094ea-63d8-4974-94d7-55e33813ed16\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.286966 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-catalog-content\") pod \"0b8094ea-63d8-4974-94d7-55e33813ed16\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.287042 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-utilities\") pod \"0b8094ea-63d8-4974-94d7-55e33813ed16\" (UID: \"0b8094ea-63d8-4974-94d7-55e33813ed16\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.287532 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.287551 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.287560 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de191a4-7cf0-4999-a102-b96a06b2ba24-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.287568 4766 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-wth26\" (UniqueName: \"kubernetes.io/projected/8de191a4-7cf0-4999-a102-b96a06b2ba24-kube-api-access-wth26\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.288432 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-utilities" (OuterVolumeSpecName: "utilities") pod "0b8094ea-63d8-4974-94d7-55e33813ed16" (UID: "0b8094ea-63d8-4974-94d7-55e33813ed16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.297684 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8094ea-63d8-4974-94d7-55e33813ed16-kube-api-access-9gvfm" (OuterVolumeSpecName: "kube-api-access-9gvfm") pod "0b8094ea-63d8-4974-94d7-55e33813ed16" (UID: "0b8094ea-63d8-4974-94d7-55e33813ed16"). InnerVolumeSpecName "kube-api-access-9gvfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.344643 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.396569 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gvfm\" (UniqueName: \"kubernetes.io/projected/0b8094ea-63d8-4974-94d7-55e33813ed16-kube-api-access-9gvfm\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.397482 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.421908 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.422443 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.467021 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b8094ea-63d8-4974-94d7-55e33813ed16" (UID: "0b8094ea-63d8-4974-94d7-55e33813ed16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.500113 4766 generic.go:334] "Generic (PLEG): container finished" podID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerID="416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f" exitCode=0 Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.500173 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqgtt" event={"ID":"0b8094ea-63d8-4974-94d7-55e33813ed16","Type":"ContainerDied","Data":"416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f"} Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.500203 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dqgtt" event={"ID":"0b8094ea-63d8-4974-94d7-55e33813ed16","Type":"ContainerDied","Data":"26695224dda7e6e041746b65f8447547db567ad9ae2501f15892369daf687825"} Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.500240 4766 scope.go:117] "RemoveContainer" containerID="416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.500389 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dqgtt" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.511910 4766 generic.go:334] "Generic (PLEG): container finished" podID="5ee1455b-319a-4093-85ba-0b97e662ecf8" containerID="37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f" exitCode=0 Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.512091 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.512671 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" event={"ID":"5ee1455b-319a-4093-85ba-0b97e662ecf8","Type":"ContainerDied","Data":"37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f"} Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.512710 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ns6dc" event={"ID":"5ee1455b-319a-4093-85ba-0b97e662ecf8","Type":"ContainerDied","Data":"dfe098adacb25ec4d068534d1058cf434692b493fba842b9e9c9d55845e39a8c"} Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.515604 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-nb\") pod \"5ee1455b-319a-4093-85ba-0b97e662ecf8\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.515676 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcfnb\" (UniqueName: \"kubernetes.io/projected/5ee1455b-319a-4093-85ba-0b97e662ecf8-kube-api-access-xcfnb\") pod \"5ee1455b-319a-4093-85ba-0b97e662ecf8\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.515705 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-sb\") pod \"5ee1455b-319a-4093-85ba-0b97e662ecf8\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.515737 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-config\") pod \"5ee1455b-319a-4093-85ba-0b97e662ecf8\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.515779 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-swift-storage-0\") pod \"5ee1455b-319a-4093-85ba-0b97e662ecf8\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.515849 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-svc\") pod \"5ee1455b-319a-4093-85ba-0b97e662ecf8\" (UID: \"5ee1455b-319a-4093-85ba-0b97e662ecf8\") " Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.516260 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b8094ea-63d8-4974-94d7-55e33813ed16-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.526594 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee1455b-319a-4093-85ba-0b97e662ecf8-kube-api-access-xcfnb" (OuterVolumeSpecName: "kube-api-access-xcfnb") pod "5ee1455b-319a-4093-85ba-0b97e662ecf8" (UID: "5ee1455b-319a-4093-85ba-0b97e662ecf8"). InnerVolumeSpecName "kube-api-access-xcfnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.529321 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lnrlz" event={"ID":"8de191a4-7cf0-4999-a102-b96a06b2ba24","Type":"ContainerDied","Data":"119aa14fd95227770d218be27373b7af457db0162daa7a8760de0a8d35f43f3c"} Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.529358 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119aa14fd95227770d218be27373b7af457db0162daa7a8760de0a8d35f43f3c" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.529369 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lnrlz" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.569874 4766 scope.go:117] "RemoveContainer" containerID="9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.588956 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dqgtt"] Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.604953 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5ee1455b-319a-4093-85ba-0b97e662ecf8" (UID: "5ee1455b-319a-4093-85ba-0b97e662ecf8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.618071 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcfnb\" (UniqueName: \"kubernetes.io/projected/5ee1455b-319a-4093-85ba-0b97e662ecf8-kube-api-access-xcfnb\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.618119 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.619745 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-config" (OuterVolumeSpecName: "config") pod "5ee1455b-319a-4093-85ba-0b97e662ecf8" (UID: "5ee1455b-319a-4093-85ba-0b97e662ecf8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.621506 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dqgtt"] Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.651431 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.663021 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ee1455b-319a-4093-85ba-0b97e662ecf8" (UID: "5ee1455b-319a-4093-85ba-0b97e662ecf8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.672567 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ee1455b-319a-4093-85ba-0b97e662ecf8" (UID: "5ee1455b-319a-4093-85ba-0b97e662ecf8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.702877 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ee1455b-319a-4093-85ba-0b97e662ecf8" (UID: "5ee1455b-319a-4093-85ba-0b97e662ecf8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.704331 4766 scope.go:117] "RemoveContainer" containerID="aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.720012 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.720040 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.720049 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.720057 4766 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee1455b-319a-4093-85ba-0b97e662ecf8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.784058 4766 scope.go:117] "RemoveContainer" containerID="416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f" Dec 09 03:36:39 crc kubenswrapper[4766]: E1209 03:36:39.784546 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f\": container with ID starting with 416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f not found: ID does not exist" containerID="416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.784579 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f"} err="failed to get container status \"416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f\": rpc error: code = NotFound desc = could not find container \"416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f\": container with ID starting with 416f413b3970f918c47ff606f6502f7e7951a0c06cb0df05473cf1d3d934201f not found: ID does not exist" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.784599 4766 scope.go:117] "RemoveContainer" containerID="9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087" Dec 09 03:36:39 crc kubenswrapper[4766]: E1209 03:36:39.784877 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087\": container with ID starting with 9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087 not found: ID does not exist" 
containerID="9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.784912 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087"} err="failed to get container status \"9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087\": rpc error: code = NotFound desc = could not find container \"9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087\": container with ID starting with 9343ad0c2ae4a40747964fd96c09e16e89d01b0f64582169abad95ee18ff6087 not found: ID does not exist" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.784935 4766 scope.go:117] "RemoveContainer" containerID="aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8" Dec 09 03:36:39 crc kubenswrapper[4766]: E1209 03:36:39.785330 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8\": container with ID starting with aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8 not found: ID does not exist" containerID="aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.785355 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8"} err="failed to get container status \"aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8\": rpc error: code = NotFound desc = could not find container \"aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8\": container with ID starting with aee1719d9a05433a0d36c0a82facb938dd85d83f39fc7d33386cb3fa346844c8 not found: ID does not exist" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.785368 4766 scope.go:117] 
"RemoveContainer" containerID="37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.803342 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.803550 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-log" containerID="cri-o://f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e" gracePeriod=30 Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.803973 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-api" containerID="cri-o://f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5" gracePeriod=30 Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.852540 4766 scope.go:117] "RemoveContainer" containerID="5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.892390 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ns6dc"] Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.904717 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ns6dc"] Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.916018 4766 scope.go:117] "RemoveContainer" containerID="37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f" Dec 09 03:36:39 crc kubenswrapper[4766]: E1209 03:36:39.925372 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f\": container with ID starting with 37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f not found: ID does not exist" 
containerID="37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.925412 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f"} err="failed to get container status \"37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f\": rpc error: code = NotFound desc = could not find container \"37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f\": container with ID starting with 37c26f249e195c4d1bd0eedac630bb96e7d0ae5afd99127e8a75df36cb19175f not found: ID does not exist" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.925438 4766 scope.go:117] "RemoveContainer" containerID="5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7" Dec 09 03:36:39 crc kubenswrapper[4766]: E1209 03:36:39.928803 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7\": container with ID starting with 5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7 not found: ID does not exist" containerID="5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.928850 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7"} err="failed to get container status \"5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7\": rpc error: code = NotFound desc = could not find container \"5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7\": container with ID starting with 5715e037eb1e46847048d09be88897485936abec117829689fa825eda8ae71b7 not found: ID does not exist" Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.996665 4766 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.996872 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerName="nova-metadata-log" containerID="cri-o://0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037" gracePeriod=30 Dec 09 03:36:39 crc kubenswrapper[4766]: I1209 03:36:39.997000 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerName="nova-metadata-metadata" containerID="cri-o://6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba" gracePeriod=30 Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.055042 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.130362 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-config-data\") pod \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.130417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-scripts\") pod \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.130502 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2zfw\" (UniqueName: \"kubernetes.io/projected/e4623229-9cd3-4c95-bba1-1c202cb8f07c-kube-api-access-t2zfw\") pod \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\" (UID: 
\"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.130555 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-combined-ca-bundle\") pod \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\" (UID: \"e4623229-9cd3-4c95-bba1-1c202cb8f07c\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.136481 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-scripts" (OuterVolumeSpecName: "scripts") pod "e4623229-9cd3-4c95-bba1-1c202cb8f07c" (UID: "e4623229-9cd3-4c95-bba1-1c202cb8f07c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.137431 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4623229-9cd3-4c95-bba1-1c202cb8f07c-kube-api-access-t2zfw" (OuterVolumeSpecName: "kube-api-access-t2zfw") pod "e4623229-9cd3-4c95-bba1-1c202cb8f07c" (UID: "e4623229-9cd3-4c95-bba1-1c202cb8f07c"). InnerVolumeSpecName "kube-api-access-t2zfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.168395 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-config-data" (OuterVolumeSpecName: "config-data") pod "e4623229-9cd3-4c95-bba1-1c202cb8f07c" (UID: "e4623229-9cd3-4c95-bba1-1c202cb8f07c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.196383 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4623229-9cd3-4c95-bba1-1c202cb8f07c" (UID: "e4623229-9cd3-4c95-bba1-1c202cb8f07c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.232185 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.232255 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.232267 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2zfw\" (UniqueName: \"kubernetes.io/projected/e4623229-9cd3-4c95-bba1-1c202cb8f07c-kube-api-access-t2zfw\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.232281 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4623229-9cd3-4c95-bba1-1c202cb8f07c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.485454 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.485674 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.550851 4766 generic.go:334] "Generic (PLEG): container finished" podID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerID="6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba" exitCode=0 Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.550888 4766 generic.go:334] "Generic (PLEG): container finished" podID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerID="0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037" exitCode=143 Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.550899 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.550952 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"848cf936-5ff0-4429-b4d4-5d0864e95cfc","Type":"ContainerDied","Data":"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba"} Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.550982 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"848cf936-5ff0-4429-b4d4-5d0864e95cfc","Type":"ContainerDied","Data":"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037"} Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.550998 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"848cf936-5ff0-4429-b4d4-5d0864e95cfc","Type":"ContainerDied","Data":"1fafc5e492ba74cfd91f2b4e5bb977d3037301b92c020ef06e0a2d1600aa3c91"} Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.551017 4766 scope.go:117] "RemoveContainer" containerID="6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.570339 4766 generic.go:334] "Generic (PLEG): container finished" podID="527525da-6fc1-4ed4-ab29-017bf44dd58e" 
containerID="f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e" exitCode=143 Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.570412 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"527525da-6fc1-4ed4-ab29-017bf44dd58e","Type":"ContainerDied","Data":"f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e"} Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.572686 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" event={"ID":"e4623229-9cd3-4c95-bba1-1c202cb8f07c","Type":"ContainerDied","Data":"e56aca4f9e087a36dbe7c807755386fd3d6db364451b426f3e8738d22827569b"} Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.572778 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e56aca4f9e087a36dbe7c807755386fd3d6db364451b426f3e8738d22827569b" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.572723 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c7kpv" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.611419 4766 scope.go:117] "RemoveContainer" containerID="0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.640539 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-nova-metadata-tls-certs\") pod \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.640606 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrw6m\" (UniqueName: \"kubernetes.io/projected/848cf936-5ff0-4429-b4d4-5d0864e95cfc-kube-api-access-qrw6m\") pod \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.640642 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/848cf936-5ff0-4429-b4d4-5d0864e95cfc-logs\") pod \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.640691 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-combined-ca-bundle\") pod \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\" (UID: \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.640722 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-config-data\") pod \"848cf936-5ff0-4429-b4d4-5d0864e95cfc\" (UID: 
\"848cf936-5ff0-4429-b4d4-5d0864e95cfc\") " Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.643508 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848cf936-5ff0-4429-b4d4-5d0864e95cfc-logs" (OuterVolumeSpecName: "logs") pod "848cf936-5ff0-4429-b4d4-5d0864e95cfc" (UID: "848cf936-5ff0-4429-b4d4-5d0864e95cfc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.646437 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.646891 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerName="extract-content" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.646913 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerName="extract-content" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.646931 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerName="nova-metadata-log" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.646941 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerName="nova-metadata-log" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.646954 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerName="nova-metadata-metadata" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.646961 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerName="nova-metadata-metadata" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.646974 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" 
containerName="registry-server" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.646983 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerName="registry-server" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.646995 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4623229-9cd3-4c95-bba1-1c202cb8f07c" containerName="nova-cell1-conductor-db-sync" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647004 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4623229-9cd3-4c95-bba1-1c202cb8f07c" containerName="nova-cell1-conductor-db-sync" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.647025 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de191a4-7cf0-4999-a102-b96a06b2ba24" containerName="nova-manage" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647032 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de191a4-7cf0-4999-a102-b96a06b2ba24" containerName="nova-manage" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.647047 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerName="extract-utilities" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647054 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerName="extract-utilities" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.647066 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee1455b-319a-4093-85ba-0b97e662ecf8" containerName="init" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647073 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee1455b-319a-4093-85ba-0b97e662ecf8" containerName="init" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.647094 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee1455b-319a-4093-85ba-0b97e662ecf8" 
containerName="dnsmasq-dns" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647101 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee1455b-319a-4093-85ba-0b97e662ecf8" containerName="dnsmasq-dns" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647338 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerName="nova-metadata-log" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647354 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" containerName="registry-server" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647363 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" containerName="nova-metadata-metadata" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647379 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4623229-9cd3-4c95-bba1-1c202cb8f07c" containerName="nova-cell1-conductor-db-sync" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647399 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de191a4-7cf0-4999-a102-b96a06b2ba24" containerName="nova-manage" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.647415 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee1455b-319a-4093-85ba-0b97e662ecf8" containerName="dnsmasq-dns" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.648176 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.656182 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.661230 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.678242 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848cf936-5ff0-4429-b4d4-5d0864e95cfc-kube-api-access-qrw6m" (OuterVolumeSpecName: "kube-api-access-qrw6m") pod "848cf936-5ff0-4429-b4d4-5d0864e95cfc" (UID: "848cf936-5ff0-4429-b4d4-5d0864e95cfc"). InnerVolumeSpecName "kube-api-access-qrw6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.698811 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-config-data" (OuterVolumeSpecName: "config-data") pod "848cf936-5ff0-4429-b4d4-5d0864e95cfc" (UID: "848cf936-5ff0-4429-b4d4-5d0864e95cfc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.711828 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "848cf936-5ff0-4429-b4d4-5d0864e95cfc" (UID: "848cf936-5ff0-4429-b4d4-5d0864e95cfc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.740377 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "848cf936-5ff0-4429-b4d4-5d0864e95cfc" (UID: "848cf936-5ff0-4429-b4d4-5d0864e95cfc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.743328 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.743437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.743491 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z46qq\" (UniqueName: \"kubernetes.io/projected/1dfb6314-1f18-4e71-947e-534dc1021381-kube-api-access-z46qq\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.743556 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrw6m\" (UniqueName: \"kubernetes.io/projected/848cf936-5ff0-4429-b4d4-5d0864e95cfc-kube-api-access-qrw6m\") on node \"crc\" DevicePath \"\"" Dec 
09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.743567 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/848cf936-5ff0-4429-b4d4-5d0864e95cfc-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.743615 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.743630 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.743641 4766 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/848cf936-5ff0-4429-b4d4-5d0864e95cfc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.815102 4766 scope.go:117] "RemoveContainer" containerID="6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.816038 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba\": container with ID starting with 6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba not found: ID does not exist" containerID="6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.816085 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba"} err="failed to get container status 
\"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba\": rpc error: code = NotFound desc = could not find container \"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba\": container with ID starting with 6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba not found: ID does not exist" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.816115 4766 scope.go:117] "RemoveContainer" containerID="0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037" Dec 09 03:36:40 crc kubenswrapper[4766]: E1209 03:36:40.816550 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037\": container with ID starting with 0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037 not found: ID does not exist" containerID="0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.816973 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037"} err="failed to get container status \"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037\": rpc error: code = NotFound desc = could not find container \"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037\": container with ID starting with 0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037 not found: ID does not exist" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.817059 4766 scope.go:117] "RemoveContainer" containerID="6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.817489 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba"} err="failed to get 
container status \"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba\": rpc error: code = NotFound desc = could not find container \"6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba\": container with ID starting with 6ee74c41b265f3b5fec90c79842568059a01fbba65c44dd955c112b05e38dcba not found: ID does not exist" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.817520 4766 scope.go:117] "RemoveContainer" containerID="0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.817839 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037"} err="failed to get container status \"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037\": rpc error: code = NotFound desc = could not find container \"0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037\": container with ID starting with 0f6692e66a6d013318e1cdba20ae7006618e576bfcf0245282981ea6973f2037 not found: ID does not exist" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.845936 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.846033 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z46qq\" (UniqueName: \"kubernetes.io/projected/1dfb6314-1f18-4e71-947e-534dc1021381-kube-api-access-z46qq\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.846073 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.850204 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b8094ea-63d8-4974-94d7-55e33813ed16" path="/var/lib/kubelet/pods/0b8094ea-63d8-4974-94d7-55e33813ed16/volumes" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.850912 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.851309 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee1455b-319a-4093-85ba-0b97e662ecf8" path="/var/lib/kubelet/pods/5ee1455b-319a-4093-85ba-0b97e662ecf8/volumes" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.851516 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.865794 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z46qq\" (UniqueName: \"kubernetes.io/projected/1dfb6314-1f18-4e71-947e-534dc1021381-kube-api-access-z46qq\") pod \"nova-cell1-conductor-0\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.913874 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.928844 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.943983 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.946588 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.947461 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-config-data\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.947763 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fe51de-b5c0-465d-81e1-7ad319e75a84-logs\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.947841 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.947938 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " 
pod="openstack/nova-metadata-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.947974 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vcsf\" (UniqueName: \"kubernetes.io/projected/95fe51de-b5c0-465d-81e1-7ad319e75a84-kube-api-access-4vcsf\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.949726 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.949915 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 09 03:36:40 crc kubenswrapper[4766]: I1209 03:36:40.972271 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.049819 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fe51de-b5c0-465d-81e1-7ad319e75a84-logs\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.049892 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.050831 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fe51de-b5c0-465d-81e1-7ad319e75a84-logs\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 
crc kubenswrapper[4766]: I1209 03:36:41.050958 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.051000 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vcsf\" (UniqueName: \"kubernetes.io/projected/95fe51de-b5c0-465d-81e1-7ad319e75a84-kube-api-access-4vcsf\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.051043 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-config-data\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.055338 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-config-data\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.057308 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.057782 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.070677 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vcsf\" (UniqueName: \"kubernetes.io/projected/95fe51de-b5c0-465d-81e1-7ad319e75a84-kube-api-access-4vcsf\") pod \"nova-metadata-0\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.123537 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.263257 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.579778 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.586564 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c952df3c-b573-4763-948d-1e73d6d85514" containerName="nova-scheduler-scheduler" containerID="cri-o://75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8" gracePeriod=30 Dec 09 03:36:41 crc kubenswrapper[4766]: W1209 03:36:41.591182 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dfb6314_1f18_4e71_947e_534dc1021381.slice/crio-6027d6b4989aa9f0c12e49a78aff088e0e757cd4f6e7f56393c2ea903c89e979 WatchSource:0}: Error finding container 6027d6b4989aa9f0c12e49a78aff088e0e757cd4f6e7f56393c2ea903c89e979: Status 404 returned error can't find the container with id 6027d6b4989aa9f0c12e49a78aff088e0e757cd4f6e7f56393c2ea903c89e979 Dec 09 03:36:41 crc 
kubenswrapper[4766]: I1209 03:36:41.670089 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.670302 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="292bc9d3-3333-4974-93c2-966d76dfa582" containerName="kube-state-metrics" containerID="cri-o://d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2" gracePeriod=30 Dec 09 03:36:41 crc kubenswrapper[4766]: W1209 03:36:41.752533 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95fe51de_b5c0_465d_81e1_7ad319e75a84.slice/crio-12c60facbfe0f40f165fa12ca257f5354d95d3f2c2930026341eb91185852bde WatchSource:0}: Error finding container 12c60facbfe0f40f165fa12ca257f5354d95d3f2c2930026341eb91185852bde: Status 404 returned error can't find the container with id 12c60facbfe0f40f165fa12ca257f5354d95d3f2c2930026341eb91185852bde Dec 09 03:36:41 crc kubenswrapper[4766]: I1209 03:36:41.754430 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.059410 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.076898 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vlxg\" (UniqueName: \"kubernetes.io/projected/292bc9d3-3333-4974-93c2-966d76dfa582-kube-api-access-9vlxg\") pod \"292bc9d3-3333-4974-93c2-966d76dfa582\" (UID: \"292bc9d3-3333-4974-93c2-966d76dfa582\") " Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.084688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292bc9d3-3333-4974-93c2-966d76dfa582-kube-api-access-9vlxg" (OuterVolumeSpecName: "kube-api-access-9vlxg") pod "292bc9d3-3333-4974-93c2-966d76dfa582" (UID: "292bc9d3-3333-4974-93c2-966d76dfa582"). InnerVolumeSpecName "kube-api-access-9vlxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.178573 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vlxg\" (UniqueName: \"kubernetes.io/projected/292bc9d3-3333-4974-93c2-966d76dfa582-kube-api-access-9vlxg\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.596826 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1dfb6314-1f18-4e71-947e-534dc1021381","Type":"ContainerStarted","Data":"9ccd949b9d58181646f9d2d30d971f167213b68716379d53e92812bc4142132d"} Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.596877 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1dfb6314-1f18-4e71-947e-534dc1021381","Type":"ContainerStarted","Data":"6027d6b4989aa9f0c12e49a78aff088e0e757cd4f6e7f56393c2ea903c89e979"} Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.596921 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:42 crc 
kubenswrapper[4766]: I1209 03:36:42.605417 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95fe51de-b5c0-465d-81e1-7ad319e75a84","Type":"ContainerStarted","Data":"e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a"} Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.605460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95fe51de-b5c0-465d-81e1-7ad319e75a84","Type":"ContainerStarted","Data":"c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae"} Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.605474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95fe51de-b5c0-465d-81e1-7ad319e75a84","Type":"ContainerStarted","Data":"12c60facbfe0f40f165fa12ca257f5354d95d3f2c2930026341eb91185852bde"} Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.620471 4766 generic.go:334] "Generic (PLEG): container finished" podID="292bc9d3-3333-4974-93c2-966d76dfa582" containerID="d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2" exitCode=2 Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.620520 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"292bc9d3-3333-4974-93c2-966d76dfa582","Type":"ContainerDied","Data":"d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2"} Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.620538 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.620567 4766 scope.go:117] "RemoveContainer" containerID="d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.620552 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"292bc9d3-3333-4974-93c2-966d76dfa582","Type":"ContainerDied","Data":"61be5ff755701f82e78defc00960cb2d6e0ad703db6bd196417fc6a08a1d710a"} Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.629773 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.6297508130000002 podStartE2EDuration="2.629750813s" podCreationTimestamp="2025-12-09 03:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:42.623462784 +0000 UTC m=+1484.332768210" watchObservedRunningTime="2025-12-09 03:36:42.629750813 +0000 UTC m=+1484.339056249" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.653144 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.653125093 podStartE2EDuration="2.653125093s" podCreationTimestamp="2025-12-09 03:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:42.648854227 +0000 UTC m=+1484.358159663" watchObservedRunningTime="2025-12-09 03:36:42.653125093 +0000 UTC m=+1484.362430519" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.730188 4766 scope.go:117] "RemoveContainer" containerID="d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2" Dec 09 03:36:42 crc kubenswrapper[4766]: E1209 03:36:42.738487 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2\": container with ID starting with d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2 not found: ID does not exist" containerID="d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.738549 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2"} err="failed to get container status \"d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2\": rpc error: code = NotFound desc = could not find container \"d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2\": container with ID starting with d2d83217548422152d45a8b6b71fa0f1dd8f25322bda5d5524e1a55384e550c2 not found: ID does not exist" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.747293 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.765396 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.782129 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:36:42 crc kubenswrapper[4766]: E1209 03:36:42.782576 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292bc9d3-3333-4974-93c2-966d76dfa582" containerName="kube-state-metrics" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.782593 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="292bc9d3-3333-4974-93c2-966d76dfa582" containerName="kube-state-metrics" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.782760 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="292bc9d3-3333-4974-93c2-966d76dfa582" containerName="kube-state-metrics" Dec 09 03:36:42 
crc kubenswrapper[4766]: I1209 03:36:42.783410 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.792609 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.796257 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.798752 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.850494 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="292bc9d3-3333-4974-93c2-966d76dfa582" path="/var/lib/kubelet/pods/292bc9d3-3333-4974-93c2-966d76dfa582/volumes" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.851282 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848cf936-5ff0-4429-b4d4-5d0864e95cfc" path="/var/lib/kubelet/pods/848cf936-5ff0-4429-b4d4-5d0864e95cfc/volumes" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.890715 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.890777 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc 
kubenswrapper[4766]: I1209 03:36:42.890808 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzvkc\" (UniqueName: \"kubernetes.io/projected/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-api-access-bzvkc\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.891116 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: E1209 03:36:42.909644 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod292bc9d3_3333_4974_93c2_966d76dfa582.slice/crio-61be5ff755701f82e78defc00960cb2d6e0ad703db6bd196417fc6a08a1d710a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod292bc9d3_3333_4974_93c2_966d76dfa582.slice\": RecentStats: unable to find data in memory cache]" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.997101 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.998584 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.998644 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzvkc\" (UniqueName: \"kubernetes.io/projected/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-api-access-bzvkc\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:42 crc kubenswrapper[4766]: I1209 03:36:42.998780 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.012895 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.014347 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.016176 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.028989 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzvkc\" (UniqueName: \"kubernetes.io/projected/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-api-access-bzvkc\") pod \"kube-state-metrics-0\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " pod="openstack/kube-state-metrics-0" Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.132659 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.613435 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.636982 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d1f2b76e-7443-46d3-a296-76196dcc28b7","Type":"ContainerStarted","Data":"be40d3f66ebf0fe4dd930179531e52cbdd602c3bb334f395b760f59c8ba17487"} Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.652310 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.654641 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="ceilometer-central-agent" containerID="cri-o://449d2dddaa0f97deefa2998a70a2009afc79446167e01a0e225a949bcd70b905" gracePeriod=30 Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.654797 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="proxy-httpd" 
containerID="cri-o://949ff15f71d458a94f6dfdcf0fdf4392ce78c50c4a716dd4dcce6614bb75c2ca" gracePeriod=30 Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.654847 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="sg-core" containerID="cri-o://aa1b6bc2d208e72f0a55decbffc820ca0e6a6e6ddd531379c4f4b69be452c224" gracePeriod=30 Dec 09 03:36:43 crc kubenswrapper[4766]: I1209 03:36:43.654903 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="ceilometer-notification-agent" containerID="cri-o://3db5b2f3ce4dc714b8a30ad8ec2e3babbb0b659bc2980b720781766e884a614d" gracePeriod=30 Dec 09 03:36:43 crc kubenswrapper[4766]: E1209 03:36:43.711830 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:36:43 crc kubenswrapper[4766]: E1209 03:36:43.713744 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:36:43 crc kubenswrapper[4766]: E1209 03:36:43.714786 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:36:43 crc 
kubenswrapper[4766]: E1209 03:36:43.714849 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c952df3c-b573-4763-948d-1e73d6d85514" containerName="nova-scheduler-scheduler" Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.652030 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d1f2b76e-7443-46d3-a296-76196dcc28b7","Type":"ContainerStarted","Data":"521a97f36468751bc00689dc257329f3eef8640694009d32dbf2669dae3f1809"} Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.652449 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.654896 4766 generic.go:334] "Generic (PLEG): container finished" podID="61498550-edcf-4bf5-8528-458189c6b171" containerID="949ff15f71d458a94f6dfdcf0fdf4392ce78c50c4a716dd4dcce6614bb75c2ca" exitCode=0 Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.654922 4766 generic.go:334] "Generic (PLEG): container finished" podID="61498550-edcf-4bf5-8528-458189c6b171" containerID="aa1b6bc2d208e72f0a55decbffc820ca0e6a6e6ddd531379c4f4b69be452c224" exitCode=2 Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.654931 4766 generic.go:334] "Generic (PLEG): container finished" podID="61498550-edcf-4bf5-8528-458189c6b171" containerID="449d2dddaa0f97deefa2998a70a2009afc79446167e01a0e225a949bcd70b905" exitCode=0 Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.654949 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerDied","Data":"949ff15f71d458a94f6dfdcf0fdf4392ce78c50c4a716dd4dcce6614bb75c2ca"} Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.654966 4766 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerDied","Data":"aa1b6bc2d208e72f0a55decbffc820ca0e6a6e6ddd531379c4f4b69be452c224"} Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.654976 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerDied","Data":"449d2dddaa0f97deefa2998a70a2009afc79446167e01a0e225a949bcd70b905"} Dec 09 03:36:44 crc kubenswrapper[4766]: I1209 03:36:44.668112 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.320020756 podStartE2EDuration="2.668091094s" podCreationTimestamp="2025-12-09 03:36:42 +0000 UTC" firstStartedPulling="2025-12-09 03:36:43.611120947 +0000 UTC m=+1485.320426373" lastFinishedPulling="2025-12-09 03:36:43.959191285 +0000 UTC m=+1485.668496711" observedRunningTime="2025-12-09 03:36:44.663751137 +0000 UTC m=+1486.373056563" watchObservedRunningTime="2025-12-09 03:36:44.668091094 +0000 UTC m=+1486.377396510" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.233540 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.264419 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.264472 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.359023 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-config-data\") pod \"c952df3c-b573-4763-948d-1e73d6d85514\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.359120 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-combined-ca-bundle\") pod \"c952df3c-b573-4763-948d-1e73d6d85514\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.359262 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-497nh\" (UniqueName: \"kubernetes.io/projected/c952df3c-b573-4763-948d-1e73d6d85514-kube-api-access-497nh\") pod \"c952df3c-b573-4763-948d-1e73d6d85514\" (UID: \"c952df3c-b573-4763-948d-1e73d6d85514\") " Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.367113 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c952df3c-b573-4763-948d-1e73d6d85514-kube-api-access-497nh" (OuterVolumeSpecName: "kube-api-access-497nh") pod "c952df3c-b573-4763-948d-1e73d6d85514" (UID: "c952df3c-b573-4763-948d-1e73d6d85514"). InnerVolumeSpecName "kube-api-access-497nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.397751 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c952df3c-b573-4763-948d-1e73d6d85514" (UID: "c952df3c-b573-4763-948d-1e73d6d85514"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.408753 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-config-data" (OuterVolumeSpecName: "config-data") pod "c952df3c-b573-4763-948d-1e73d6d85514" (UID: "c952df3c-b573-4763-948d-1e73d6d85514"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.462269 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.462304 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952df3c-b573-4763-948d-1e73d6d85514-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.462317 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-497nh\" (UniqueName: \"kubernetes.io/projected/c952df3c-b573-4763-948d-1e73d6d85514-kube-api-access-497nh\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.608276 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.681628 4766 generic.go:334] "Generic (PLEG): container finished" podID="c952df3c-b573-4763-948d-1e73d6d85514" containerID="75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8" exitCode=0 Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.681695 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.681723 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c952df3c-b573-4763-948d-1e73d6d85514","Type":"ContainerDied","Data":"75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8"} Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.681762 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c952df3c-b573-4763-948d-1e73d6d85514","Type":"ContainerDied","Data":"d592fc87f935ffa0f7ca9a56cd1689e5eca8ad64a55e9fb87e6e4ba639cc096e"} Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.681783 4766 scope.go:117] "RemoveContainer" containerID="75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.684625 4766 generic.go:334] "Generic (PLEG): container finished" podID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerID="f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5" exitCode=0 Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.684666 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"527525da-6fc1-4ed4-ab29-017bf44dd58e","Type":"ContainerDied","Data":"f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5"} Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.684695 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"527525da-6fc1-4ed4-ab29-017bf44dd58e","Type":"ContainerDied","Data":"976a6189b31cce0c7a8946982bee72faf96a800bdce19aaf5f9f3d02cba0e3e4"} Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.684763 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.720469 4766 scope.go:117] "RemoveContainer" containerID="75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8" Dec 09 03:36:46 crc kubenswrapper[4766]: E1209 03:36:46.721654 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8\": container with ID starting with 75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8 not found: ID does not exist" containerID="75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.721706 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8"} err="failed to get container status \"75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8\": rpc error: code = NotFound desc = could not find container \"75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8\": container with ID starting with 75b9db04ee525af11893db7de7c6d182f9d68a73187ba22c4f08c6074d1f74d8 not found: ID does not exist" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.721749 4766 scope.go:117] "RemoveContainer" containerID="f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.725440 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.736017 4766 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.745687 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:46 crc kubenswrapper[4766]: E1209 03:36:46.746132 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-api" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.746156 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-api" Dec 09 03:36:46 crc kubenswrapper[4766]: E1209 03:36:46.746193 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-log" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.746202 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-log" Dec 09 03:36:46 crc kubenswrapper[4766]: E1209 03:36:46.746283 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c952df3c-b573-4763-948d-1e73d6d85514" containerName="nova-scheduler-scheduler" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.746305 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c952df3c-b573-4763-948d-1e73d6d85514" containerName="nova-scheduler-scheduler" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.746548 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c952df3c-b573-4763-948d-1e73d6d85514" containerName="nova-scheduler-scheduler" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.746582 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" containerName="nova-api-log" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.746598 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" 
containerName="nova-api-api" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.747331 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.752947 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.767458 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-config-data\") pod \"527525da-6fc1-4ed4-ab29-017bf44dd58e\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.767527 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-combined-ca-bundle\") pod \"527525da-6fc1-4ed4-ab29-017bf44dd58e\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.767596 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vsbk\" (UniqueName: \"kubernetes.io/projected/527525da-6fc1-4ed4-ab29-017bf44dd58e-kube-api-access-5vsbk\") pod \"527525da-6fc1-4ed4-ab29-017bf44dd58e\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.767621 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527525da-6fc1-4ed4-ab29-017bf44dd58e-logs\") pod \"527525da-6fc1-4ed4-ab29-017bf44dd58e\" (UID: \"527525da-6fc1-4ed4-ab29-017bf44dd58e\") " Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.769598 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.775060 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527525da-6fc1-4ed4-ab29-017bf44dd58e-kube-api-access-5vsbk" (OuterVolumeSpecName: "kube-api-access-5vsbk") pod "527525da-6fc1-4ed4-ab29-017bf44dd58e" (UID: "527525da-6fc1-4ed4-ab29-017bf44dd58e"). InnerVolumeSpecName "kube-api-access-5vsbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.781013 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527525da-6fc1-4ed4-ab29-017bf44dd58e-logs" (OuterVolumeSpecName: "logs") pod "527525da-6fc1-4ed4-ab29-017bf44dd58e" (UID: "527525da-6fc1-4ed4-ab29-017bf44dd58e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.785339 4766 scope.go:117] "RemoveContainer" containerID="f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.802513 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-config-data" (OuterVolumeSpecName: "config-data") pod "527525da-6fc1-4ed4-ab29-017bf44dd58e" (UID: "527525da-6fc1-4ed4-ab29-017bf44dd58e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.806041 4766 scope.go:117] "RemoveContainer" containerID="f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5" Dec 09 03:36:46 crc kubenswrapper[4766]: E1209 03:36:46.806627 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5\": container with ID starting with f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5 not found: ID does not exist" containerID="f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.806674 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5"} err="failed to get container status \"f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5\": rpc error: code = NotFound desc = could not find container \"f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5\": container with ID starting with f716a92ef6041c6dd6c00121a956913c32eb32a706e7063e36413f966f708bd5 not found: ID does not exist" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.806705 4766 scope.go:117] "RemoveContainer" containerID="f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.807054 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "527525da-6fc1-4ed4-ab29-017bf44dd58e" (UID: "527525da-6fc1-4ed4-ab29-017bf44dd58e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:46 crc kubenswrapper[4766]: E1209 03:36:46.807311 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e\": container with ID starting with f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e not found: ID does not exist" containerID="f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.807354 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e"} err="failed to get container status \"f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e\": rpc error: code = NotFound desc = could not find container \"f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e\": container with ID starting with f3b7807a67f676d2556d24c1db7f32204b6a2ba090fccaf1598976ffc40e058e not found: ID does not exist" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.850880 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c952df3c-b573-4763-948d-1e73d6d85514" path="/var/lib/kubelet/pods/c952df3c-b573-4763-948d-1e73d6d85514/volumes" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.869436 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.869566 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntwl\" (UniqueName: 
\"kubernetes.io/projected/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-kube-api-access-wntwl\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.869620 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-config-data\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.869725 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.869737 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vsbk\" (UniqueName: \"kubernetes.io/projected/527525da-6fc1-4ed4-ab29-017bf44dd58e-kube-api-access-5vsbk\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.869747 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527525da-6fc1-4ed4-ab29-017bf44dd58e-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.869760 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527525da-6fc1-4ed4-ab29-017bf44dd58e-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.976957 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-config-data\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 
03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.977138 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.977268 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wntwl\" (UniqueName: \"kubernetes.io/projected/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-kube-api-access-wntwl\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.984690 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.991695 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-config-data\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:46 crc kubenswrapper[4766]: I1209 03:36:46.998485 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntwl\" (UniqueName: \"kubernetes.io/projected/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-kube-api-access-wntwl\") pod \"nova-scheduler-0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " pod="openstack/nova-scheduler-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.071110 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.090273 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.100902 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.128114 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.133570 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.139401 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.157907 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.282545 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.282585 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-logs\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.282634 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-config-data\") pod \"nova-api-0\" (UID: 
\"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.282693 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcgxb\" (UniqueName: \"kubernetes.io/projected/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-kube-api-access-gcgxb\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.384136 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.384193 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-logs\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.384276 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-config-data\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.384364 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgxb\" (UniqueName: \"kubernetes.io/projected/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-kube-api-access-gcgxb\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.384965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-logs\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.389590 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-config-data\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.392409 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.399717 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgxb\" (UniqueName: \"kubernetes.io/projected/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-kube-api-access-gcgxb\") pod \"nova-api-0\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.459428 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.523777 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:36:47 crc kubenswrapper[4766]: W1209 03:36:47.528719 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod485630d7_5fe3_4f68_a448_d12f6a7ba6b0.slice/crio-adc731ac9cdbd132ab466b0c36f3b9f827c4aca2f7f2483a51ba2951988b2d61 WatchSource:0}: Error finding container adc731ac9cdbd132ab466b0c36f3b9f827c4aca2f7f2483a51ba2951988b2d61: Status 404 returned error can't find the container with id adc731ac9cdbd132ab466b0c36f3b9f827c4aca2f7f2483a51ba2951988b2d61 Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.700299 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"485630d7-5fe3-4f68-a448-d12f6a7ba6b0","Type":"ContainerStarted","Data":"adc731ac9cdbd132ab466b0c36f3b9f827c4aca2f7f2483a51ba2951988b2d61"} Dec 09 03:36:47 crc kubenswrapper[4766]: I1209 03:36:47.887572 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:36:48 crc kubenswrapper[4766]: I1209 03:36:48.724637 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"485630d7-5fe3-4f68-a448-d12f6a7ba6b0","Type":"ContainerStarted","Data":"39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427"} Dec 09 03:36:48 crc kubenswrapper[4766]: I1209 03:36:48.727224 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b17e91-76c0-4fa6-93fc-0a2529699d9e","Type":"ContainerStarted","Data":"ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32"} Dec 09 03:36:48 crc kubenswrapper[4766]: I1209 03:36:48.727256 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b4b17e91-76c0-4fa6-93fc-0a2529699d9e","Type":"ContainerStarted","Data":"3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f"} Dec 09 03:36:48 crc kubenswrapper[4766]: I1209 03:36:48.727266 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b17e91-76c0-4fa6-93fc-0a2529699d9e","Type":"ContainerStarted","Data":"00c7366c3304a4765c861683110965627fd21732277fe9b34110bb72693227ca"} Dec 09 03:36:48 crc kubenswrapper[4766]: I1209 03:36:48.748983 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.748959568 podStartE2EDuration="2.748959568s" podCreationTimestamp="2025-12-09 03:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:48.742115484 +0000 UTC m=+1490.451420920" watchObservedRunningTime="2025-12-09 03:36:48.748959568 +0000 UTC m=+1490.458265014" Dec 09 03:36:48 crc kubenswrapper[4766]: I1209 03:36:48.766117 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.76609742 podStartE2EDuration="1.76609742s" podCreationTimestamp="2025-12-09 03:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:36:48.764702872 +0000 UTC m=+1490.474008308" watchObservedRunningTime="2025-12-09 03:36:48.76609742 +0000 UTC m=+1490.475402856" Dec 09 03:36:48 crc kubenswrapper[4766]: I1209 03:36:48.850462 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527525da-6fc1-4ed4-ab29-017bf44dd58e" path="/var/lib/kubelet/pods/527525da-6fc1-4ed4-ab29-017bf44dd58e/volumes" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.748639 4766 generic.go:334] "Generic (PLEG): container finished" podID="61498550-edcf-4bf5-8528-458189c6b171" 
containerID="3db5b2f3ce4dc714b8a30ad8ec2e3babbb0b659bc2980b720781766e884a614d" exitCode=0 Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.748730 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerDied","Data":"3db5b2f3ce4dc714b8a30ad8ec2e3babbb0b659bc2980b720781766e884a614d"} Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.749270 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61498550-edcf-4bf5-8528-458189c6b171","Type":"ContainerDied","Data":"0ef240330be81bb6fd5b0ae89c2ddf06811cbba2ef5220a089c7b4266ccb9f17"} Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.749293 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ef240330be81bb6fd5b0ae89c2ddf06811cbba2ef5220a089c7b4266ccb9f17" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.824712 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.958160 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-combined-ca-bundle\") pod \"61498550-edcf-4bf5-8528-458189c6b171\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.958292 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-config-data\") pod \"61498550-edcf-4bf5-8528-458189c6b171\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.958352 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-run-httpd\") pod \"61498550-edcf-4bf5-8528-458189c6b171\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.958417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvc9c\" (UniqueName: \"kubernetes.io/projected/61498550-edcf-4bf5-8528-458189c6b171-kube-api-access-cvc9c\") pod \"61498550-edcf-4bf5-8528-458189c6b171\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.958454 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-sg-core-conf-yaml\") pod \"61498550-edcf-4bf5-8528-458189c6b171\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.958520 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-scripts\") pod \"61498550-edcf-4bf5-8528-458189c6b171\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.958588 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-log-httpd\") pod \"61498550-edcf-4bf5-8528-458189c6b171\" (UID: \"61498550-edcf-4bf5-8528-458189c6b171\") " Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.958785 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "61498550-edcf-4bf5-8528-458189c6b171" (UID: "61498550-edcf-4bf5-8528-458189c6b171"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.959307 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "61498550-edcf-4bf5-8528-458189c6b171" (UID: "61498550-edcf-4bf5-8528-458189c6b171"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.959659 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.959683 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61498550-edcf-4bf5-8528-458189c6b171-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.963479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-scripts" (OuterVolumeSpecName: "scripts") pod "61498550-edcf-4bf5-8528-458189c6b171" (UID: "61498550-edcf-4bf5-8528-458189c6b171"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.963798 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61498550-edcf-4bf5-8528-458189c6b171-kube-api-access-cvc9c" (OuterVolumeSpecName: "kube-api-access-cvc9c") pod "61498550-edcf-4bf5-8528-458189c6b171" (UID: "61498550-edcf-4bf5-8528-458189c6b171"). InnerVolumeSpecName "kube-api-access-cvc9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:36:50 crc kubenswrapper[4766]: I1209 03:36:50.993507 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "61498550-edcf-4bf5-8528-458189c6b171" (UID: "61498550-edcf-4bf5-8528-458189c6b171"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.053509 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61498550-edcf-4bf5-8528-458189c6b171" (UID: "61498550-edcf-4bf5-8528-458189c6b171"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.061467 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.061675 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvc9c\" (UniqueName: \"kubernetes.io/projected/61498550-edcf-4bf5-8528-458189c6b171-kube-api-access-cvc9c\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.061750 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.061823 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.073032 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-config-data" (OuterVolumeSpecName: "config-data") pod "61498550-edcf-4bf5-8528-458189c6b171" (UID: "61498550-edcf-4bf5-8528-458189c6b171"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.150899 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.162860 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61498550-edcf-4bf5-8528-458189c6b171-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.265152 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.265194 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.756437 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.796386 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.810731 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.825824 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:51 crc kubenswrapper[4766]: E1209 03:36:51.826330 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="proxy-httpd" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.826399 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="proxy-httpd" Dec 09 03:36:51 crc kubenswrapper[4766]: E1209 03:36:51.826418 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="ceilometer-central-agent" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.826424 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="ceilometer-central-agent" Dec 09 03:36:51 crc kubenswrapper[4766]: E1209 03:36:51.826434 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="ceilometer-notification-agent" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.826440 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="ceilometer-notification-agent" Dec 09 03:36:51 crc kubenswrapper[4766]: E1209 03:36:51.826450 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="sg-core" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.826457 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="sg-core" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.826652 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="ceilometer-notification-agent" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.826671 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="proxy-httpd" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.826683 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="sg-core" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.826694 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="61498550-edcf-4bf5-8528-458189c6b171" containerName="ceilometer-central-agent" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.828381 4766 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.830910 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.831131 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.834015 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.854592 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.977128 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.977173 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-config-data\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.977252 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-scripts\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.977281 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-log-httpd\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.977500 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.977723 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7bc\" (UniqueName: \"kubernetes.io/projected/e42e356c-bce7-45d5-ac36-7a8c4b018a18-kube-api-access-zq7bc\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.977840 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:51 crc kubenswrapper[4766]: I1209 03:36:51.977943 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-run-httpd\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.072354 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.079392 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-log-httpd\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.079448 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.079505 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7bc\" (UniqueName: \"kubernetes.io/projected/e42e356c-bce7-45d5-ac36-7a8c4b018a18-kube-api-access-zq7bc\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.079548 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.079564 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-run-httpd\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.079597 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.079612 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-config-data\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.079636 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-scripts\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.080334 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-run-httpd\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.081410 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-log-httpd\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.083668 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.088084 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-config-data\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.088434 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-scripts\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.099629 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.110438 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.119052 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7bc\" (UniqueName: \"kubernetes.io/projected/e42e356c-bce7-45d5-ac36-7a8c4b018a18-kube-api-access-zq7bc\") pod \"ceilometer-0\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.149321 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.278482 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.278566 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.592363 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.771455 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerStarted","Data":"a74b6ec0a02cab0c00e40e15b9025ce2bec71df7e4824bd9a0e5899b50452d45"} Dec 09 03:36:52 crc kubenswrapper[4766]: I1209 03:36:52.854291 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61498550-edcf-4bf5-8528-458189c6b171" path="/var/lib/kubelet/pods/61498550-edcf-4bf5-8528-458189c6b171/volumes" Dec 09 03:36:53 crc kubenswrapper[4766]: I1209 03:36:53.140132 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 09 03:36:53 crc kubenswrapper[4766]: I1209 03:36:53.781584 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerStarted","Data":"899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd"} Dec 09 03:36:54 crc 
kubenswrapper[4766]: I1209 03:36:54.797233 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerStarted","Data":"f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2"} Dec 09 03:36:55 crc kubenswrapper[4766]: I1209 03:36:55.807580 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerStarted","Data":"cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c"} Dec 09 03:36:56 crc kubenswrapper[4766]: I1209 03:36:56.819616 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerStarted","Data":"727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063"} Dec 09 03:36:56 crc kubenswrapper[4766]: I1209 03:36:56.820024 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 03:36:56 crc kubenswrapper[4766]: I1209 03:36:56.853286 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.412578352 podStartE2EDuration="5.85326307s" podCreationTimestamp="2025-12-09 03:36:51 +0000 UTC" firstStartedPulling="2025-12-09 03:36:52.608593587 +0000 UTC m=+1494.317899023" lastFinishedPulling="2025-12-09 03:36:56.049278315 +0000 UTC m=+1497.758583741" observedRunningTime="2025-12-09 03:36:56.851920944 +0000 UTC m=+1498.561226370" watchObservedRunningTime="2025-12-09 03:36:56.85326307 +0000 UTC m=+1498.562568496" Dec 09 03:36:57 crc kubenswrapper[4766]: I1209 03:36:57.071649 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 03:36:57 crc kubenswrapper[4766]: I1209 03:36:57.097237 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 
03:36:57 crc kubenswrapper[4766]: I1209 03:36:57.459736 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 03:36:57 crc kubenswrapper[4766]: I1209 03:36:57.461099 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 03:36:57 crc kubenswrapper[4766]: I1209 03:36:57.859028 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 03:36:58 crc kubenswrapper[4766]: I1209 03:36:58.541537 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 03:36:58 crc kubenswrapper[4766]: I1209 03:36:58.541537 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.005903 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qkg8d"] Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.008485 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.013732 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkg8d"] Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.112609 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgvkg\" (UniqueName: \"kubernetes.io/projected/109e3717-5e70-414f-9555-6d343ff2cf9b-kube-api-access-qgvkg\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.112658 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-utilities\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.112708 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-catalog-content\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.213590 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgvkg\" (UniqueName: \"kubernetes.io/projected/109e3717-5e70-414f-9555-6d343ff2cf9b-kube-api-access-qgvkg\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.213919 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-utilities\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.213971 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-catalog-content\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.214649 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-utilities\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.214666 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-catalog-content\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.238318 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgvkg\" (UniqueName: \"kubernetes.io/projected/109e3717-5e70-414f-9555-6d343ff2cf9b-kube-api-access-qgvkg\") pod \"redhat-marketplace-qkg8d\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.343177 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:36:59 crc kubenswrapper[4766]: I1209 03:36:59.872969 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkg8d"] Dec 09 03:37:00 crc kubenswrapper[4766]: I1209 03:37:00.870592 4766 generic.go:334] "Generic (PLEG): container finished" podID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerID="8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66" exitCode=0 Dec 09 03:37:00 crc kubenswrapper[4766]: I1209 03:37:00.870650 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkg8d" event={"ID":"109e3717-5e70-414f-9555-6d343ff2cf9b","Type":"ContainerDied","Data":"8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66"} Dec 09 03:37:00 crc kubenswrapper[4766]: I1209 03:37:00.871421 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkg8d" event={"ID":"109e3717-5e70-414f-9555-6d343ff2cf9b","Type":"ContainerStarted","Data":"12e95266c42982c3841e9c2ec0386bdd0cbc8b6f25f362198c3daf3fd26099b7"} Dec 09 03:37:01 crc kubenswrapper[4766]: I1209 03:37:01.269716 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 03:37:01 crc kubenswrapper[4766]: I1209 03:37:01.274320 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 03:37:01 crc kubenswrapper[4766]: I1209 03:37:01.290571 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 03:37:01 crc kubenswrapper[4766]: I1209 03:37:01.886813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkg8d" event={"ID":"109e3717-5e70-414f-9555-6d343ff2cf9b","Type":"ContainerStarted","Data":"1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892"} Dec 09 03:37:01 crc 
kubenswrapper[4766]: I1209 03:37:01.899529 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 03:37:02 crc kubenswrapper[4766]: I1209 03:37:02.904391 4766 generic.go:334] "Generic (PLEG): container finished" podID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerID="1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892" exitCode=0 Dec 09 03:37:02 crc kubenswrapper[4766]: I1209 03:37:02.904450 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkg8d" event={"ID":"109e3717-5e70-414f-9555-6d343ff2cf9b","Type":"ContainerDied","Data":"1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892"} Dec 09 03:37:03 crc kubenswrapper[4766]: I1209 03:37:03.919241 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkg8d" event={"ID":"109e3717-5e70-414f-9555-6d343ff2cf9b","Type":"ContainerStarted","Data":"b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d"} Dec 09 03:37:03 crc kubenswrapper[4766]: I1209 03:37:03.943344 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qkg8d" podStartSLOduration=3.479991787 podStartE2EDuration="5.943319277s" podCreationTimestamp="2025-12-09 03:36:58 +0000 UTC" firstStartedPulling="2025-12-09 03:37:00.873002911 +0000 UTC m=+1502.582308337" lastFinishedPulling="2025-12-09 03:37:03.336330361 +0000 UTC m=+1505.045635827" observedRunningTime="2025-12-09 03:37:03.940137112 +0000 UTC m=+1505.649442548" watchObservedRunningTime="2025-12-09 03:37:03.943319277 +0000 UTC m=+1505.652624723" Dec 09 03:37:04 crc kubenswrapper[4766]: I1209 03:37:04.890649 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:04 crc kubenswrapper[4766]: I1209 03:37:04.932854 4766 generic.go:334] "Generic (PLEG): container finished" podID="765c23d8-f967-4624-8a2e-82ecc9788177" containerID="c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744" exitCode=137 Dec 09 03:37:04 crc kubenswrapper[4766]: I1209 03:37:04.932901 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"765c23d8-f967-4624-8a2e-82ecc9788177","Type":"ContainerDied","Data":"c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744"} Dec 09 03:37:04 crc kubenswrapper[4766]: I1209 03:37:04.932946 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"765c23d8-f967-4624-8a2e-82ecc9788177","Type":"ContainerDied","Data":"576f4341c24a60f4bc27348fca37944e064d32ae2e06e3223928ba2911af1dae"} Dec 09 03:37:04 crc kubenswrapper[4766]: I1209 03:37:04.932963 4766 scope.go:117] "RemoveContainer" containerID="c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744" Dec 09 03:37:04 crc kubenswrapper[4766]: I1209 03:37:04.932920 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:04 crc kubenswrapper[4766]: I1209 03:37:04.961455 4766 scope.go:117] "RemoveContainer" containerID="c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744" Dec 09 03:37:04 crc kubenswrapper[4766]: E1209 03:37:04.962096 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744\": container with ID starting with c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744 not found: ID does not exist" containerID="c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744" Dec 09 03:37:04 crc kubenswrapper[4766]: I1209 03:37:04.962147 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744"} err="failed to get container status \"c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744\": rpc error: code = NotFound desc = could not find container \"c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744\": container with ID starting with c2b6ad6e49b0afd66fc570aafa95866eebe09fccec438e5d37712a4111518744 not found: ID does not exist" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.030546 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-config-data\") pod \"765c23d8-f967-4624-8a2e-82ecc9788177\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.031138 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-combined-ca-bundle\") pod \"765c23d8-f967-4624-8a2e-82ecc9788177\" (UID: 
\"765c23d8-f967-4624-8a2e-82ecc9788177\") " Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.031313 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f9ms\" (UniqueName: \"kubernetes.io/projected/765c23d8-f967-4624-8a2e-82ecc9788177-kube-api-access-9f9ms\") pod \"765c23d8-f967-4624-8a2e-82ecc9788177\" (UID: \"765c23d8-f967-4624-8a2e-82ecc9788177\") " Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.036631 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765c23d8-f967-4624-8a2e-82ecc9788177-kube-api-access-9f9ms" (OuterVolumeSpecName: "kube-api-access-9f9ms") pod "765c23d8-f967-4624-8a2e-82ecc9788177" (UID: "765c23d8-f967-4624-8a2e-82ecc9788177"). InnerVolumeSpecName "kube-api-access-9f9ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.065964 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-config-data" (OuterVolumeSpecName: "config-data") pod "765c23d8-f967-4624-8a2e-82ecc9788177" (UID: "765c23d8-f967-4624-8a2e-82ecc9788177"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.067416 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "765c23d8-f967-4624-8a2e-82ecc9788177" (UID: "765c23d8-f967-4624-8a2e-82ecc9788177"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.133136 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f9ms\" (UniqueName: \"kubernetes.io/projected/765c23d8-f967-4624-8a2e-82ecc9788177-kube-api-access-9f9ms\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.133470 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.133606 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765c23d8-f967-4624-8a2e-82ecc9788177-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.272812 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.281443 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.295159 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:37:05 crc kubenswrapper[4766]: E1209 03:37:05.295550 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765c23d8-f967-4624-8a2e-82ecc9788177" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.295571 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="765c23d8-f967-4624-8a2e-82ecc9788177" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.295841 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="765c23d8-f967-4624-8a2e-82ecc9788177" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 
03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.296447 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.298547 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.300107 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.300270 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.319946 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.336995 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxq6l\" (UniqueName: \"kubernetes.io/projected/354d9984-d7b5-4540-a96e-a68a7bf1b667-kube-api-access-dxq6l\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.337048 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.337105 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.337125 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.337155 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.438901 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxq6l\" (UniqueName: \"kubernetes.io/projected/354d9984-d7b5-4540-a96e-a68a7bf1b667-kube-api-access-dxq6l\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.439312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.439884 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.439909 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.439936 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.443824 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.443996 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.444785 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.444914 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.455687 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxq6l\" (UniqueName: \"kubernetes.io/projected/354d9984-d7b5-4540-a96e-a68a7bf1b667-kube-api-access-dxq6l\") pod \"nova-cell1-novncproxy-0\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:05 crc kubenswrapper[4766]: I1209 03:37:05.634698 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:06 crc kubenswrapper[4766]: I1209 03:37:06.159021 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:37:06 crc kubenswrapper[4766]: I1209 03:37:06.856573 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765c23d8-f967-4624-8a2e-82ecc9788177" path="/var/lib/kubelet/pods/765c23d8-f967-4624-8a2e-82ecc9788177/volumes" Dec 09 03:37:06 crc kubenswrapper[4766]: I1209 03:37:06.958384 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"354d9984-d7b5-4540-a96e-a68a7bf1b667","Type":"ContainerStarted","Data":"ce107d8559532857d9f216c112581bb9f83cc5d523eef80c33104385461d818c"} Dec 09 03:37:06 crc kubenswrapper[4766]: I1209 03:37:06.958467 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"354d9984-d7b5-4540-a96e-a68a7bf1b667","Type":"ContainerStarted","Data":"9f2831e19b47104f2e7157a8f0f1a5b07c69cf3c45b38540e44f7797bf0fae3c"} Dec 
09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.002044 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.002018191 podStartE2EDuration="2.002018191s" podCreationTimestamp="2025-12-09 03:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:37:06.983842242 +0000 UTC m=+1508.693147708" watchObservedRunningTime="2025-12-09 03:37:07.002018191 +0000 UTC m=+1508.711323627" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.317150 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.317262 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.317326 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.318465 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 
03:37:07.318564 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" gracePeriod=600 Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.467733 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.469443 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.471003 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.473595 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 03:37:07 crc kubenswrapper[4766]: E1209 03:37:07.960044 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.972903 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" exitCode=0 Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.974195 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917"} Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.974278 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.974299 4766 scope.go:117] "RemoveContainer" containerID="86ed0df3c1c9e8bf00e2182a8d0f2c1317600b335e2702ed7b54e17bf114fc74" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.975131 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:37:07 crc kubenswrapper[4766]: E1209 03:37:07.975444 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:37:07 crc kubenswrapper[4766]: I1209 03:37:07.985782 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.223175 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-khb7n"] Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.227254 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.245770 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-khb7n"] Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.299625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.299686 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-config\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.299807 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgqf5\" (UniqueName: \"kubernetes.io/projected/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-kube-api-access-jgqf5\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.299857 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.299895 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.299917 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.401020 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.401115 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.401177 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-config\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.401278 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgqf5\" 
(UniqueName: \"kubernetes.io/projected/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-kube-api-access-jgqf5\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.401334 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.401377 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.402107 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.402140 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-config\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.402197 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-sb\") pod 
\"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.402464 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.402727 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.424203 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgqf5\" (UniqueName: \"kubernetes.io/projected/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-kube-api-access-jgqf5\") pod \"dnsmasq-dns-59cf4bdb65-khb7n\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:08 crc kubenswrapper[4766]: I1209 03:37:08.550890 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:09 crc kubenswrapper[4766]: I1209 03:37:09.047413 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-khb7n"] Dec 09 03:37:09 crc kubenswrapper[4766]: I1209 03:37:09.345070 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:37:09 crc kubenswrapper[4766]: I1209 03:37:09.345361 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:37:09 crc kubenswrapper[4766]: I1209 03:37:09.395813 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.011301 4766 generic.go:334] "Generic (PLEG): container finished" podID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerID="e1349528a0d97478daa0910811d2f8c4ed839212e46f900922e25dfb57923704" exitCode=0 Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.011384 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" event={"ID":"7443d5d9-873e-430e-bcad-a90f5d4ca9c6","Type":"ContainerDied","Data":"e1349528a0d97478daa0910811d2f8c4ed839212e46f900922e25dfb57923704"} Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.011804 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" event={"ID":"7443d5d9-873e-430e-bcad-a90f5d4ca9c6","Type":"ContainerStarted","Data":"93a744d7458295293fcc130bdda48bf9f5f93f092bcdf76a102a27cd922c0629"} Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.074521 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.133057 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qkg8d"] Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.144667 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.144929 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="ceilometer-central-agent" containerID="cri-o://899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd" gracePeriod=30 Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.145533 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="ceilometer-notification-agent" containerID="cri-o://f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2" gracePeriod=30 Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.145679 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="sg-core" containerID="cri-o://cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c" gracePeriod=30 Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.145548 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="proxy-httpd" containerID="cri-o://727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063" gracePeriod=30 Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.156641 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.192:3000/\": EOF" Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.593021 4766 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-0"] Dec 09 03:37:10 crc kubenswrapper[4766]: I1209 03:37:10.635262 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.025479 4766 generic.go:334] "Generic (PLEG): container finished" podID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerID="727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063" exitCode=0 Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.025795 4766 generic.go:334] "Generic (PLEG): container finished" podID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerID="cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c" exitCode=2 Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.025803 4766 generic.go:334] "Generic (PLEG): container finished" podID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerID="899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd" exitCode=0 Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.025547 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerDied","Data":"727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063"} Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.025868 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerDied","Data":"cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c"} Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.025883 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerDied","Data":"899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd"} Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.030375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" event={"ID":"7443d5d9-873e-430e-bcad-a90f5d4ca9c6","Type":"ContainerStarted","Data":"247c3eaf1f8095705dd3d8948a06084c404d9dd6d14b121c931d90dca158ef0e"} Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.030735 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-log" containerID="cri-o://3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f" gracePeriod=30 Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.030828 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-api" containerID="cri-o://ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32" gracePeriod=30 Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.031021 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:11 crc kubenswrapper[4766]: I1209 03:37:11.048435 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" podStartSLOduration=3.048416141 podStartE2EDuration="3.048416141s" podCreationTimestamp="2025-12-09 03:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:37:11.046592111 +0000 UTC m=+1512.755897547" watchObservedRunningTime="2025-12-09 03:37:11.048416141 +0000 UTC m=+1512.757721577" Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.043880 4766 generic.go:334] "Generic (PLEG): container finished" podID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerID="3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f" exitCode=143 Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.044434 4766 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-qkg8d" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerName="registry-server" containerID="cri-o://b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d" gracePeriod=2 Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.043966 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b17e91-76c0-4fa6-93fc-0a2529699d9e","Type":"ContainerDied","Data":"3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f"} Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.511894 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.684633 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-utilities\") pod \"109e3717-5e70-414f-9555-6d343ff2cf9b\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.684737 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-catalog-content\") pod \"109e3717-5e70-414f-9555-6d343ff2cf9b\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.684774 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgvkg\" (UniqueName: \"kubernetes.io/projected/109e3717-5e70-414f-9555-6d343ff2cf9b-kube-api-access-qgvkg\") pod \"109e3717-5e70-414f-9555-6d343ff2cf9b\" (UID: \"109e3717-5e70-414f-9555-6d343ff2cf9b\") " Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.685847 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-utilities" (OuterVolumeSpecName: "utilities") pod "109e3717-5e70-414f-9555-6d343ff2cf9b" (UID: "109e3717-5e70-414f-9555-6d343ff2cf9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.686437 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.696421 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109e3717-5e70-414f-9555-6d343ff2cf9b-kube-api-access-qgvkg" (OuterVolumeSpecName: "kube-api-access-qgvkg") pod "109e3717-5e70-414f-9555-6d343ff2cf9b" (UID: "109e3717-5e70-414f-9555-6d343ff2cf9b"). InnerVolumeSpecName "kube-api-access-qgvkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.708366 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "109e3717-5e70-414f-9555-6d343ff2cf9b" (UID: "109e3717-5e70-414f-9555-6d343ff2cf9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.787110 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109e3717-5e70-414f-9555-6d343ff2cf9b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:12 crc kubenswrapper[4766]: I1209 03:37:12.787156 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgvkg\" (UniqueName: \"kubernetes.io/projected/109e3717-5e70-414f-9555-6d343ff2cf9b-kube-api-access-qgvkg\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.055474 4766 generic.go:334] "Generic (PLEG): container finished" podID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerID="b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d" exitCode=0 Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.055554 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkg8d" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.055574 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkg8d" event={"ID":"109e3717-5e70-414f-9555-6d343ff2cf9b","Type":"ContainerDied","Data":"b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d"} Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.056935 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkg8d" event={"ID":"109e3717-5e70-414f-9555-6d343ff2cf9b","Type":"ContainerDied","Data":"12e95266c42982c3841e9c2ec0386bdd0cbc8b6f25f362198c3daf3fd26099b7"} Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.056965 4766 scope.go:117] "RemoveContainer" containerID="b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.081561 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qkg8d"] Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.091714 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkg8d"] Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.097702 4766 scope.go:117] "RemoveContainer" containerID="1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.129443 4766 scope.go:117] "RemoveContainer" containerID="8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.166120 4766 scope.go:117] "RemoveContainer" containerID="b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d" Dec 09 03:37:13 crc kubenswrapper[4766]: E1209 03:37:13.168839 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d\": container with ID starting with b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d not found: ID does not exist" containerID="b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.168965 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d"} err="failed to get container status \"b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d\": rpc error: code = NotFound desc = could not find container \"b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d\": container with ID starting with b6db1900efab1524b2bd254b69eae14dd318dd4206fe772a1c80bbbcb8c20d0d not found: ID does not exist" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.169060 4766 scope.go:117] "RemoveContainer" 
containerID="1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892" Dec 09 03:37:13 crc kubenswrapper[4766]: E1209 03:37:13.169462 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892\": container with ID starting with 1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892 not found: ID does not exist" containerID="1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.169578 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892"} err="failed to get container status \"1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892\": rpc error: code = NotFound desc = could not find container \"1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892\": container with ID starting with 1aebfb9ffc5c526c378ea2f7a6f9c8956d85b9cbe72f366ec5533dd97d2c1892 not found: ID does not exist" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.169810 4766 scope.go:117] "RemoveContainer" containerID="8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66" Dec 09 03:37:13 crc kubenswrapper[4766]: E1209 03:37:13.170649 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66\": container with ID starting with 8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66 not found: ID does not exist" containerID="8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66" Dec 09 03:37:13 crc kubenswrapper[4766]: I1209 03:37:13.170673 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66"} err="failed to get container status \"8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66\": rpc error: code = NotFound desc = could not find container \"8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66\": container with ID starting with 8161f0c58e5e58beebbac22043202c8eaa523de1aca4f7103e417be54ab50b66 not found: ID does not exist" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.614524 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.722948 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-combined-ca-bundle\") pod \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.723353 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcgxb\" (UniqueName: \"kubernetes.io/projected/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-kube-api-access-gcgxb\") pod \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.723388 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-config-data\") pod \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\" (UID: \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.723429 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-logs\") pod \"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\" (UID: 
\"b4b17e91-76c0-4fa6-93fc-0a2529699d9e\") " Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.724116 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-logs" (OuterVolumeSpecName: "logs") pod "b4b17e91-76c0-4fa6-93fc-0a2529699d9e" (UID: "b4b17e91-76c0-4fa6-93fc-0a2529699d9e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.740585 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-kube-api-access-gcgxb" (OuterVolumeSpecName: "kube-api-access-gcgxb") pod "b4b17e91-76c0-4fa6-93fc-0a2529699d9e" (UID: "b4b17e91-76c0-4fa6-93fc-0a2529699d9e"). InnerVolumeSpecName "kube-api-access-gcgxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.758819 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4b17e91-76c0-4fa6-93fc-0a2529699d9e" (UID: "b4b17e91-76c0-4fa6-93fc-0a2529699d9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.768565 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-config-data" (OuterVolumeSpecName: "config-data") pod "b4b17e91-76c0-4fa6-93fc-0a2529699d9e" (UID: "b4b17e91-76c0-4fa6-93fc-0a2529699d9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.825994 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.826049 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcgxb\" (UniqueName: \"kubernetes.io/projected/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-kube-api-access-gcgxb\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.826078 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.826103 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4b17e91-76c0-4fa6-93fc-0a2529699d9e-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:14 crc kubenswrapper[4766]: I1209 03:37:14.849053 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" path="/var/lib/kubelet/pods/109e3717-5e70-414f-9555-6d343ff2cf9b/volumes" Dec 09 03:37:15 crc kubenswrapper[4766]: E1209 03:37:15.019095 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4b17e91_76c0_4fa6_93fc_0a2529699d9e.slice\": RecentStats: unable to find data in memory cache]" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.095133 4766 generic.go:334] "Generic (PLEG): container finished" podID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerID="ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32" exitCode=0 Dec 09 03:37:15 crc 
kubenswrapper[4766]: I1209 03:37:15.095192 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b17e91-76c0-4fa6-93fc-0a2529699d9e","Type":"ContainerDied","Data":"ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32"}
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.095560 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b4b17e91-76c0-4fa6-93fc-0a2529699d9e","Type":"ContainerDied","Data":"00c7366c3304a4765c861683110965627fd21732277fe9b34110bb72693227ca"}
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.095588 4766 scope.go:117] "RemoveContainer" containerID="ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.095265 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.138275 4766 scope.go:117] "RemoveContainer" containerID="3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.140643 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.156667 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.170291 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.170840 4766 scope.go:117] "RemoveContainer" containerID="ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32"
Dec 09 03:37:15 crc kubenswrapper[4766]: E1209 03:37:15.170841 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-api"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.170923 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-api"
Dec 09 03:37:15 crc kubenswrapper[4766]: E1209 03:37:15.170941 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-log"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.170947 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-log"
Dec 09 03:37:15 crc kubenswrapper[4766]: E1209 03:37:15.170960 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerName="registry-server"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.170966 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerName="registry-server"
Dec 09 03:37:15 crc kubenswrapper[4766]: E1209 03:37:15.170982 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerName="extract-utilities"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.170988 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerName="extract-utilities"
Dec 09 03:37:15 crc kubenswrapper[4766]: E1209 03:37:15.171009 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerName="extract-content"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.171014 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerName="extract-content"
Dec 09 03:37:15 crc kubenswrapper[4766]: E1209 03:37:15.171163 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32\": container with ID starting with ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32 not found: ID does not exist" containerID="ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.171189 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32"} err="failed to get container status \"ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32\": rpc error: code = NotFound desc = could not find container \"ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32\": container with ID starting with ce44e7b49373221db26c0c8fc98e011d355258c4c60e906bfa939189bbd58f32 not found: ID does not exist"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.171224 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="109e3717-5e70-414f-9555-6d343ff2cf9b" containerName="registry-server"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.171230 4766 scope.go:117] "RemoveContainer" containerID="3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.171256 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-log"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.171272 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" containerName="nova-api-api"
Dec 09 03:37:15 crc kubenswrapper[4766]: E1209 03:37:15.171449 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f\": container with ID starting with 3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f not found: ID does not exist" containerID="3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.171465 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f"} err="failed to get container status \"3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f\": rpc error: code = NotFound desc = could not find container \"3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f\": container with ID starting with 3c8324fc0982bc0b6c2d43e02378b51a745773fb0c29ef83c57ff2dea5c81d5f not found: ID does not exist"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.172199 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.174064 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.174335 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.174369 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.181125 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.233568 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-public-tls-certs\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0"
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.233629 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.233883 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-internal-tls-certs\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.233942 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-config-data\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.234056 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30bb330d-d600-4d0f-86e4-edd3da24bc92-logs\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.234135 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vj2w\" (UniqueName: \"kubernetes.io/projected/30bb330d-d600-4d0f-86e4-edd3da24bc92-kube-api-access-9vj2w\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.335636 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.335676 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-config-data\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.335709 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30bb330d-d600-4d0f-86e4-edd3da24bc92-logs\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.335740 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vj2w\" (UniqueName: \"kubernetes.io/projected/30bb330d-d600-4d0f-86e4-edd3da24bc92-kube-api-access-9vj2w\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.335786 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-public-tls-certs\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.335814 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.336494 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/30bb330d-d600-4d0f-86e4-edd3da24bc92-logs\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.339927 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.340366 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-public-tls-certs\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.340682 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-config-data\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.340974 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-internal-tls-certs\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.358199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vj2w\" (UniqueName: \"kubernetes.io/projected/30bb330d-d600-4d0f-86e4-edd3da24bc92-kube-api-access-9vj2w\") pod \"nova-api-0\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.516785 4766 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.635885 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.660177 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.660911 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.746366 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-config-data\") pod \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.746540 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq7bc\" (UniqueName: \"kubernetes.io/projected/e42e356c-bce7-45d5-ac36-7a8c4b018a18-kube-api-access-zq7bc\") pod \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.746623 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-log-httpd\") pod \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.746730 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-run-httpd\") pod \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\" (UID: 
\"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.746846 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-ceilometer-tls-certs\") pod \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.746906 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-sg-core-conf-yaml\") pod \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.746943 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-scripts\") pod \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.747049 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-combined-ca-bundle\") pod \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\" (UID: \"e42e356c-bce7-45d5-ac36-7a8c4b018a18\") " Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.747288 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e42e356c-bce7-45d5-ac36-7a8c4b018a18" (UID: "e42e356c-bce7-45d5-ac36-7a8c4b018a18"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.747368 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e42e356c-bce7-45d5-ac36-7a8c4b018a18" (UID: "e42e356c-bce7-45d5-ac36-7a8c4b018a18"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.747670 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.747684 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e42e356c-bce7-45d5-ac36-7a8c4b018a18-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.751091 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42e356c-bce7-45d5-ac36-7a8c4b018a18-kube-api-access-zq7bc" (OuterVolumeSpecName: "kube-api-access-zq7bc") pod "e42e356c-bce7-45d5-ac36-7a8c4b018a18" (UID: "e42e356c-bce7-45d5-ac36-7a8c4b018a18"). InnerVolumeSpecName "kube-api-access-zq7bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.753406 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-scripts" (OuterVolumeSpecName: "scripts") pod "e42e356c-bce7-45d5-ac36-7a8c4b018a18" (UID: "e42e356c-bce7-45d5-ac36-7a8c4b018a18"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.778409 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e42e356c-bce7-45d5-ac36-7a8c4b018a18" (UID: "e42e356c-bce7-45d5-ac36-7a8c4b018a18"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.817144 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e42e356c-bce7-45d5-ac36-7a8c4b018a18" (UID: "e42e356c-bce7-45d5-ac36-7a8c4b018a18"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.819555 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e42e356c-bce7-45d5-ac36-7a8c4b018a18" (UID: "e42e356c-bce7-45d5-ac36-7a8c4b018a18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.849661 4766 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.849692 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.849701 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.849711 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.849721 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq7bc\" (UniqueName: \"kubernetes.io/projected/e42e356c-bce7-45d5-ac36-7a8c4b018a18-kube-api-access-zq7bc\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.856685 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-config-data" (OuterVolumeSpecName: "config-data") pod "e42e356c-bce7-45d5-ac36-7a8c4b018a18" (UID: "e42e356c-bce7-45d5-ac36-7a8c4b018a18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 03:37:15 crc kubenswrapper[4766]: I1209 03:37:15.950945 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42e356c-bce7-45d5-ac36-7a8c4b018a18-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 03:37:16 crc kubenswrapper[4766]: W1209 03:37:16.008462 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30bb330d_d600_4d0f_86e4_edd3da24bc92.slice/crio-4d041b27280edf1b35d0f6818927d068f9d6303baa447becf15418c6afce455d WatchSource:0}: Error finding container 4d041b27280edf1b35d0f6818927d068f9d6303baa447becf15418c6afce455d: Status 404 returned error can't find the container with id 4d041b27280edf1b35d0f6818927d068f9d6303baa447becf15418c6afce455d
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.013253 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.107124 4766 generic.go:334] "Generic (PLEG): container finished" podID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerID="f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2" exitCode=0
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.107175 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerDied","Data":"f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2"}
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.107562 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e42e356c-bce7-45d5-ac36-7a8c4b018a18","Type":"ContainerDied","Data":"a74b6ec0a02cab0c00e40e15b9025ce2bec71df7e4824bd9a0e5899b50452d45"}
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.107296 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.107584 4766 scope.go:117] "RemoveContainer" containerID="727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.109777 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30bb330d-d600-4d0f-86e4-edd3da24bc92","Type":"ContainerStarted","Data":"4d041b27280edf1b35d0f6818927d068f9d6303baa447becf15418c6afce455d"}
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.133925 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.150125 4766 scope.go:117] "RemoveContainer" containerID="cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.152449 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.163503 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.191453 4766 scope.go:117] "RemoveContainer" containerID="f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.202360 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 09 03:37:16 crc kubenswrapper[4766]: E1209 03:37:16.204273 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="ceilometer-central-agent"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.204299 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="ceilometer-central-agent"
Dec 09 03:37:16 crc kubenswrapper[4766]: E1209 03:37:16.204318 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="proxy-httpd"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.204326 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="proxy-httpd"
Dec 09 03:37:16 crc kubenswrapper[4766]: E1209 03:37:16.204341 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="ceilometer-notification-agent"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.204350 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="ceilometer-notification-agent"
Dec 09 03:37:16 crc kubenswrapper[4766]: E1209 03:37:16.204362 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="sg-core"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.204369 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="sg-core"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.204572 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="ceilometer-central-agent"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.204589 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="proxy-httpd"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.204610 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="sg-core"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.204629 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" containerName="ceilometer-notification-agent"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.211263 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.213499 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.214600 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.214732 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.239460 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.247496 4766 scope.go:117] "RemoveContainer" containerID="899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.254724 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.254758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-scripts\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0"
Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.254777 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-config-data\") pod \"ceilometer-0\" (UID:
\"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.254864 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.255098 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-run-httpd\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.255191 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68nf2\" (UniqueName: \"kubernetes.io/projected/54e4f191-0150-4bdb-9afa-2cc5164c6b55-kube-api-access-68nf2\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.255348 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.255411 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-log-httpd\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc 
kubenswrapper[4766]: I1209 03:37:16.268427 4766 scope.go:117] "RemoveContainer" containerID="727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063" Dec 09 03:37:16 crc kubenswrapper[4766]: E1209 03:37:16.269982 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063\": container with ID starting with 727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063 not found: ID does not exist" containerID="727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.270046 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063"} err="failed to get container status \"727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063\": rpc error: code = NotFound desc = could not find container \"727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063\": container with ID starting with 727e7d8627b254990a7bbdf2121b2fc5169c3f314cef193cb29f6036e2959063 not found: ID does not exist" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.270083 4766 scope.go:117] "RemoveContainer" containerID="cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c" Dec 09 03:37:16 crc kubenswrapper[4766]: E1209 03:37:16.270896 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c\": container with ID starting with cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c not found: ID does not exist" containerID="cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.270924 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c"} err="failed to get container status \"cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c\": rpc error: code = NotFound desc = could not find container \"cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c\": container with ID starting with cf657f49aed1d455ce9e876b38dcf7f8b7e7ffa1b31d054f80793a6d7b14708c not found: ID does not exist" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.270943 4766 scope.go:117] "RemoveContainer" containerID="f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2" Dec 09 03:37:16 crc kubenswrapper[4766]: E1209 03:37:16.271190 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2\": container with ID starting with f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2 not found: ID does not exist" containerID="f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.271337 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2"} err="failed to get container status \"f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2\": rpc error: code = NotFound desc = could not find container \"f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2\": container with ID starting with f1d14d289c08cd5803b3207dbd47d1031aaa0169d3439fac293265fb30c21ea2 not found: ID does not exist" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.271365 4766 scope.go:117] "RemoveContainer" containerID="899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd" Dec 09 03:37:16 crc kubenswrapper[4766]: E1209 03:37:16.271695 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd\": container with ID starting with 899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd not found: ID does not exist" containerID="899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.271722 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd"} err="failed to get container status \"899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd\": rpc error: code = NotFound desc = could not find container \"899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd\": container with ID starting with 899476d113cfd319ba25af3765cdaa3d7697c7103e5023f7569a593d532ff3cd not found: ID does not exist" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.357274 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.358006 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-scripts\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.358155 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-config-data\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 
03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.358320 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.358477 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-run-httpd\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.358641 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68nf2\" (UniqueName: \"kubernetes.io/projected/54e4f191-0150-4bdb-9afa-2cc5164c6b55-kube-api-access-68nf2\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.360596 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.360952 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-log-httpd\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.359255 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-run-httpd\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.361653 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-log-httpd\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.374044 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.374582 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-scripts\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.375137 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.379928 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.394046 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-config-data\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.397454 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-j4snv"] Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.400756 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.402415 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68nf2\" (UniqueName: \"kubernetes.io/projected/54e4f191-0150-4bdb-9afa-2cc5164c6b55-kube-api-access-68nf2\") pod \"ceilometer-0\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.405153 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.405555 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.416512 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j4snv"] Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.461565 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.461618 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-config-data\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.461674 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-scripts\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.461711 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lsns\" (UniqueName: \"kubernetes.io/projected/9a559b56-2a81-4f1f-b42a-c46550710c47-kube-api-access-2lsns\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.542026 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.562607 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-scripts\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.562671 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lsns\" (UniqueName: \"kubernetes.io/projected/9a559b56-2a81-4f1f-b42a-c46550710c47-kube-api-access-2lsns\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.562744 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.562773 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-config-data\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.568937 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-scripts\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc 
kubenswrapper[4766]: I1209 03:37:16.569245 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-config-data\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.569601 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.582167 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lsns\" (UniqueName: \"kubernetes.io/projected/9a559b56-2a81-4f1f-b42a-c46550710c47-kube-api-access-2lsns\") pod \"nova-cell1-cell-mapping-j4snv\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.819306 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.849958 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b17e91-76c0-4fa6-93fc-0a2529699d9e" path="/var/lib/kubelet/pods/b4b17e91-76c0-4fa6-93fc-0a2529699d9e/volumes" Dec 09 03:37:16 crc kubenswrapper[4766]: I1209 03:37:16.850777 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42e356c-bce7-45d5-ac36-7a8c4b018a18" path="/var/lib/kubelet/pods/e42e356c-bce7-45d5-ac36-7a8c4b018a18/volumes" Dec 09 03:37:17 crc kubenswrapper[4766]: I1209 03:37:17.010464 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:37:17 crc kubenswrapper[4766]: W1209 03:37:17.014982 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54e4f191_0150_4bdb_9afa_2cc5164c6b55.slice/crio-6dc105c8b5ca5bf8bcc72f794af15cdec8e465fa04414c427ef2eae773c478cc WatchSource:0}: Error finding container 6dc105c8b5ca5bf8bcc72f794af15cdec8e465fa04414c427ef2eae773c478cc: Status 404 returned error can't find the container with id 6dc105c8b5ca5bf8bcc72f794af15cdec8e465fa04414c427ef2eae773c478cc Dec 09 03:37:17 crc kubenswrapper[4766]: I1209 03:37:17.123021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30bb330d-d600-4d0f-86e4-edd3da24bc92","Type":"ContainerStarted","Data":"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055"} Dec 09 03:37:17 crc kubenswrapper[4766]: I1209 03:37:17.123067 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30bb330d-d600-4d0f-86e4-edd3da24bc92","Type":"ContainerStarted","Data":"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119"} Dec 09 03:37:17 crc kubenswrapper[4766]: I1209 03:37:17.124332 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerStarted","Data":"6dc105c8b5ca5bf8bcc72f794af15cdec8e465fa04414c427ef2eae773c478cc"} Dec 09 03:37:17 crc kubenswrapper[4766]: I1209 03:37:17.155669 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.155644924 podStartE2EDuration="2.155644924s" podCreationTimestamp="2025-12-09 03:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:37:17.14694089 +0000 UTC m=+1518.856246336" watchObservedRunningTime="2025-12-09 03:37:17.155644924 +0000 UTC m=+1518.864950350" Dec 09 03:37:17 crc kubenswrapper[4766]: I1209 03:37:17.229587 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j4snv"] Dec 09 03:37:17 crc kubenswrapper[4766]: W1209 03:37:17.230451 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a559b56_2a81_4f1f_b42a_c46550710c47.slice/crio-0f58c0b426f0b6e28c919c79fb2a305bd5ba4112e9665b15d7464905b02897c9 WatchSource:0}: Error finding container 0f58c0b426f0b6e28c919c79fb2a305bd5ba4112e9665b15d7464905b02897c9: Status 404 returned error can't find the container with id 0f58c0b426f0b6e28c919c79fb2a305bd5ba4112e9665b15d7464905b02897c9 Dec 09 03:37:18 crc kubenswrapper[4766]: I1209 03:37:18.143096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerStarted","Data":"d74114f82622e892b4a5e3b24618b3e3f03c2d8600dbbfb022a279d06521b94a"} Dec 09 03:37:18 crc kubenswrapper[4766]: I1209 03:37:18.151184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j4snv" 
event={"ID":"9a559b56-2a81-4f1f-b42a-c46550710c47","Type":"ContainerStarted","Data":"63bf0756d28bad553080eb2bde20d0a5a0b4bb3782133960bfc5df7f50fdbbea"} Dec 09 03:37:18 crc kubenswrapper[4766]: I1209 03:37:18.151260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j4snv" event={"ID":"9a559b56-2a81-4f1f-b42a-c46550710c47","Type":"ContainerStarted","Data":"0f58c0b426f0b6e28c919c79fb2a305bd5ba4112e9665b15d7464905b02897c9"} Dec 09 03:37:18 crc kubenswrapper[4766]: I1209 03:37:18.553461 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:37:18 crc kubenswrapper[4766]: I1209 03:37:18.598765 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-j4snv" podStartSLOduration=2.598740662 podStartE2EDuration="2.598740662s" podCreationTimestamp="2025-12-09 03:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:37:18.176867633 +0000 UTC m=+1519.886173089" watchObservedRunningTime="2025-12-09 03:37:18.598740662 +0000 UTC m=+1520.308046128" Dec 09 03:37:18 crc kubenswrapper[4766]: I1209 03:37:18.658871 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dgqtt"] Dec 09 03:37:18 crc kubenswrapper[4766]: I1209 03:37:18.659111 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" podUID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerName="dnsmasq-dns" containerID="cri-o://ce3ccf75f0cf375ecd11a04f0e8f5e4eaab059c87c848791d6110f0f4ae70604" gracePeriod=10 Dec 09 03:37:18 crc kubenswrapper[4766]: I1209 03:37:18.720484 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" podUID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.183:5353: connect: connection refused" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.157810 4766 generic.go:334] "Generic (PLEG): container finished" podID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerID="ce3ccf75f0cf375ecd11a04f0e8f5e4eaab059c87c848791d6110f0f4ae70604" exitCode=0 Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.158002 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" event={"ID":"fd933eb5-5f13-4e72-910c-aa495cfae9f3","Type":"ContainerDied","Data":"ce3ccf75f0cf375ecd11a04f0e8f5e4eaab059c87c848791d6110f0f4ae70604"} Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.158162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" event={"ID":"fd933eb5-5f13-4e72-910c-aa495cfae9f3","Type":"ContainerDied","Data":"9615a2ea011fcb06065675ed4612ac1831a66023d62abf069c5d66631ad5e92f"} Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.158176 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9615a2ea011fcb06065675ed4612ac1831a66023d62abf069c5d66631ad5e92f" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.160821 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerStarted","Data":"6d71bb4e4b5fbaf252c35fbed7673f51dd06fc7cc2b55da69c2bf9d00916cfed"} Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.160843 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerStarted","Data":"f4f799754c7d424d9e1a02edb786aa1d853e958a00bbe81227da9a2cf14ca4ee"} Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.200498 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.236759 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-swift-storage-0\") pod \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.236914 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-config\") pod \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.236940 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d29dz\" (UniqueName: \"kubernetes.io/projected/fd933eb5-5f13-4e72-910c-aa495cfae9f3-kube-api-access-d29dz\") pod \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.236968 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-nb\") pod \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.236991 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-svc\") pod \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.237018 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-sb\") pod \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\" (UID: \"fd933eb5-5f13-4e72-910c-aa495cfae9f3\") " Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.250098 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd933eb5-5f13-4e72-910c-aa495cfae9f3-kube-api-access-d29dz" (OuterVolumeSpecName: "kube-api-access-d29dz") pod "fd933eb5-5f13-4e72-910c-aa495cfae9f3" (UID: "fd933eb5-5f13-4e72-910c-aa495cfae9f3"). InnerVolumeSpecName "kube-api-access-d29dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.310497 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fd933eb5-5f13-4e72-910c-aa495cfae9f3" (UID: "fd933eb5-5f13-4e72-910c-aa495cfae9f3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.311564 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd933eb5-5f13-4e72-910c-aa495cfae9f3" (UID: "fd933eb5-5f13-4e72-910c-aa495cfae9f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.319163 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd933eb5-5f13-4e72-910c-aa495cfae9f3" (UID: "fd933eb5-5f13-4e72-910c-aa495cfae9f3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.327509 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-config" (OuterVolumeSpecName: "config") pod "fd933eb5-5f13-4e72-910c-aa495cfae9f3" (UID: "fd933eb5-5f13-4e72-910c-aa495cfae9f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.337827 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd933eb5-5f13-4e72-910c-aa495cfae9f3" (UID: "fd933eb5-5f13-4e72-910c-aa495cfae9f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.339003 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.339019 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.339041 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d29dz\" (UniqueName: \"kubernetes.io/projected/fd933eb5-5f13-4e72-910c-aa495cfae9f3-kube-api-access-d29dz\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.339052 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:19 crc 
kubenswrapper[4766]: I1209 03:37:19.339061 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.339069 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd933eb5-5f13-4e72-910c-aa495cfae9f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:19 crc kubenswrapper[4766]: I1209 03:37:19.839345 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:37:19 crc kubenswrapper[4766]: E1209 03:37:19.839842 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:37:20 crc kubenswrapper[4766]: I1209 03:37:20.178581 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-dgqtt" Dec 09 03:37:20 crc kubenswrapper[4766]: I1209 03:37:20.282268 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dgqtt"] Dec 09 03:37:20 crc kubenswrapper[4766]: I1209 03:37:20.294762 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dgqtt"] Dec 09 03:37:20 crc kubenswrapper[4766]: I1209 03:37:20.856779 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" path="/var/lib/kubelet/pods/fd933eb5-5f13-4e72-910c-aa495cfae9f3/volumes" Dec 09 03:37:21 crc kubenswrapper[4766]: I1209 03:37:21.189187 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerStarted","Data":"7f14118b1949b7f4fa11ccf3e1c0978b97f42aeb4702c3b2333fce0c82a717cf"} Dec 09 03:37:21 crc kubenswrapper[4766]: I1209 03:37:21.191575 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 03:37:21 crc kubenswrapper[4766]: I1209 03:37:21.223404 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.137076742 podStartE2EDuration="5.223376959s" podCreationTimestamp="2025-12-09 03:37:16 +0000 UTC" firstStartedPulling="2025-12-09 03:37:17.017931843 +0000 UTC m=+1518.727237269" lastFinishedPulling="2025-12-09 03:37:20.10423206 +0000 UTC m=+1521.813537486" observedRunningTime="2025-12-09 03:37:21.214317885 +0000 UTC m=+1522.923623351" watchObservedRunningTime="2025-12-09 03:37:21.223376959 +0000 UTC m=+1522.932682385" Dec 09 03:37:23 crc kubenswrapper[4766]: I1209 03:37:23.212732 4766 generic.go:334] "Generic (PLEG): container finished" podID="9a559b56-2a81-4f1f-b42a-c46550710c47" containerID="63bf0756d28bad553080eb2bde20d0a5a0b4bb3782133960bfc5df7f50fdbbea" exitCode=0 Dec 09 03:37:23 crc 
kubenswrapper[4766]: I1209 03:37:23.212795 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j4snv" event={"ID":"9a559b56-2a81-4f1f-b42a-c46550710c47","Type":"ContainerDied","Data":"63bf0756d28bad553080eb2bde20d0a5a0b4bb3782133960bfc5df7f50fdbbea"} Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.665159 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.765868 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-combined-ca-bundle\") pod \"9a559b56-2a81-4f1f-b42a-c46550710c47\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.765983 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-config-data\") pod \"9a559b56-2a81-4f1f-b42a-c46550710c47\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.766345 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lsns\" (UniqueName: \"kubernetes.io/projected/9a559b56-2a81-4f1f-b42a-c46550710c47-kube-api-access-2lsns\") pod \"9a559b56-2a81-4f1f-b42a-c46550710c47\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.766383 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-scripts\") pod \"9a559b56-2a81-4f1f-b42a-c46550710c47\" (UID: \"9a559b56-2a81-4f1f-b42a-c46550710c47\") " Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.773306 4766 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a559b56-2a81-4f1f-b42a-c46550710c47-kube-api-access-2lsns" (OuterVolumeSpecName: "kube-api-access-2lsns") pod "9a559b56-2a81-4f1f-b42a-c46550710c47" (UID: "9a559b56-2a81-4f1f-b42a-c46550710c47"). InnerVolumeSpecName "kube-api-access-2lsns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.773524 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-scripts" (OuterVolumeSpecName: "scripts") pod "9a559b56-2a81-4f1f-b42a-c46550710c47" (UID: "9a559b56-2a81-4f1f-b42a-c46550710c47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.802489 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a559b56-2a81-4f1f-b42a-c46550710c47" (UID: "9a559b56-2a81-4f1f-b42a-c46550710c47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.802522 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-config-data" (OuterVolumeSpecName: "config-data") pod "9a559b56-2a81-4f1f-b42a-c46550710c47" (UID: "9a559b56-2a81-4f1f-b42a-c46550710c47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.869880 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lsns\" (UniqueName: \"kubernetes.io/projected/9a559b56-2a81-4f1f-b42a-c46550710c47-kube-api-access-2lsns\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.869910 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.869921 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:24 crc kubenswrapper[4766]: I1209 03:37:24.869931 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a559b56-2a81-4f1f-b42a-c46550710c47-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.233550 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j4snv" event={"ID":"9a559b56-2a81-4f1f-b42a-c46550710c47","Type":"ContainerDied","Data":"0f58c0b426f0b6e28c919c79fb2a305bd5ba4112e9665b15d7464905b02897c9"} Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.233595 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f58c0b426f0b6e28c919c79fb2a305bd5ba4112e9665b15d7464905b02897c9" Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.233604 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j4snv" Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.400425 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.401034 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="485630d7-5fe3-4f68-a448-d12f6a7ba6b0" containerName="nova-scheduler-scheduler" containerID="cri-o://39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427" gracePeriod=30 Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.416756 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.417024 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerName="nova-api-log" containerID="cri-o://7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119" gracePeriod=30 Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.417087 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerName="nova-api-api" containerID="cri-o://6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055" gracePeriod=30 Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.493291 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.493662 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-metadata" containerID="cri-o://e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a" gracePeriod=30 Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.493569 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-log" containerID="cri-o://c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae" gracePeriod=30 Dec 09 03:37:25 crc kubenswrapper[4766]: I1209 03:37:25.985748 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.101372 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-public-tls-certs\") pod \"30bb330d-d600-4d0f-86e4-edd3da24bc92\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.101472 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30bb330d-d600-4d0f-86e4-edd3da24bc92-logs\") pod \"30bb330d-d600-4d0f-86e4-edd3da24bc92\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.101503 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-internal-tls-certs\") pod \"30bb330d-d600-4d0f-86e4-edd3da24bc92\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.101547 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-combined-ca-bundle\") pod \"30bb330d-d600-4d0f-86e4-edd3da24bc92\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.101598 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-9vj2w\" (UniqueName: \"kubernetes.io/projected/30bb330d-d600-4d0f-86e4-edd3da24bc92-kube-api-access-9vj2w\") pod \"30bb330d-d600-4d0f-86e4-edd3da24bc92\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.101643 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-config-data\") pod \"30bb330d-d600-4d0f-86e4-edd3da24bc92\" (UID: \"30bb330d-d600-4d0f-86e4-edd3da24bc92\") " Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.101859 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30bb330d-d600-4d0f-86e4-edd3da24bc92-logs" (OuterVolumeSpecName: "logs") pod "30bb330d-d600-4d0f-86e4-edd3da24bc92" (UID: "30bb330d-d600-4d0f-86e4-edd3da24bc92"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.102098 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30bb330d-d600-4d0f-86e4-edd3da24bc92-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.106772 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30bb330d-d600-4d0f-86e4-edd3da24bc92-kube-api-access-9vj2w" (OuterVolumeSpecName: "kube-api-access-9vj2w") pod "30bb330d-d600-4d0f-86e4-edd3da24bc92" (UID: "30bb330d-d600-4d0f-86e4-edd3da24bc92"). InnerVolumeSpecName "kube-api-access-9vj2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.129183 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30bb330d-d600-4d0f-86e4-edd3da24bc92" (UID: "30bb330d-d600-4d0f-86e4-edd3da24bc92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.135726 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-config-data" (OuterVolumeSpecName: "config-data") pod "30bb330d-d600-4d0f-86e4-edd3da24bc92" (UID: "30bb330d-d600-4d0f-86e4-edd3da24bc92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.157347 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30bb330d-d600-4d0f-86e4-edd3da24bc92" (UID: "30bb330d-d600-4d0f-86e4-edd3da24bc92"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.159330 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30bb330d-d600-4d0f-86e4-edd3da24bc92" (UID: "30bb330d-d600-4d0f-86e4-edd3da24bc92"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.204783 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.204840 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.204855 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.204864 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vj2w\" (UniqueName: \"kubernetes.io/projected/30bb330d-d600-4d0f-86e4-edd3da24bc92-kube-api-access-9vj2w\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.204896 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bb330d-d600-4d0f-86e4-edd3da24bc92-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.243960 4766 generic.go:334] "Generic (PLEG): container finished" podID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerID="c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae" exitCode=143 Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.244027 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95fe51de-b5c0-465d-81e1-7ad319e75a84","Type":"ContainerDied","Data":"c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae"} Dec 09 03:37:26 crc 
kubenswrapper[4766]: I1209 03:37:26.246086 4766 generic.go:334] "Generic (PLEG): container finished" podID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerID="6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055" exitCode=0 Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.246115 4766 generic.go:334] "Generic (PLEG): container finished" podID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerID="7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119" exitCode=143 Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.246132 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.246135 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30bb330d-d600-4d0f-86e4-edd3da24bc92","Type":"ContainerDied","Data":"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055"} Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.246264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30bb330d-d600-4d0f-86e4-edd3da24bc92","Type":"ContainerDied","Data":"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119"} Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.246275 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30bb330d-d600-4d0f-86e4-edd3da24bc92","Type":"ContainerDied","Data":"4d041b27280edf1b35d0f6818927d068f9d6303baa447becf15418c6afce455d"} Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.246292 4766 scope.go:117] "RemoveContainer" containerID="6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.273132 4766 scope.go:117] "RemoveContainer" containerID="7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.284592 4766 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.295919 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.306097 4766 scope.go:117] "RemoveContainer" containerID="6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055" Dec 09 03:37:26 crc kubenswrapper[4766]: E1209 03:37:26.306549 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055\": container with ID starting with 6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055 not found: ID does not exist" containerID="6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.306577 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055"} err="failed to get container status \"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055\": rpc error: code = NotFound desc = could not find container \"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055\": container with ID starting with 6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055 not found: ID does not exist" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.306596 4766 scope.go:117] "RemoveContainer" containerID="7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119" Dec 09 03:37:26 crc kubenswrapper[4766]: E1209 03:37:26.306852 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119\": container with ID starting with 7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119 not found: ID does 
not exist" containerID="7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.306870 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119"} err="failed to get container status \"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119\": rpc error: code = NotFound desc = could not find container \"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119\": container with ID starting with 7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119 not found: ID does not exist" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.306882 4766 scope.go:117] "RemoveContainer" containerID="6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.307058 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055"} err="failed to get container status \"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055\": rpc error: code = NotFound desc = could not find container \"6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055\": container with ID starting with 6d39b791743e6fdd44b0ba76c4b7d8f4fb452bc07a7499566efad112fe468055 not found: ID does not exist" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.307073 4766 scope.go:117] "RemoveContainer" containerID="7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.307262 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119"} err="failed to get container status \"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119\": rpc error: code = NotFound 
desc = could not find container \"7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119\": container with ID starting with 7e89b232f53e86b03e20917a70ca56fe450b81caf69d502fb3fd93fff4a67119 not found: ID does not exist" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.307698 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 03:37:26 crc kubenswrapper[4766]: E1209 03:37:26.308177 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerName="nova-api-api" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308201 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerName="nova-api-api" Dec 09 03:37:26 crc kubenswrapper[4766]: E1209 03:37:26.308240 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerName="nova-api-log" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308250 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerName="nova-api-log" Dec 09 03:37:26 crc kubenswrapper[4766]: E1209 03:37:26.308270 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerName="dnsmasq-dns" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308278 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerName="dnsmasq-dns" Dec 09 03:37:26 crc kubenswrapper[4766]: E1209 03:37:26.308295 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerName="init" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308303 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerName="init" Dec 09 03:37:26 crc kubenswrapper[4766]: E1209 03:37:26.308329 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a559b56-2a81-4f1f-b42a-c46550710c47" containerName="nova-manage" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308338 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a559b56-2a81-4f1f-b42a-c46550710c47" containerName="nova-manage" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308566 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a559b56-2a81-4f1f-b42a-c46550710c47" containerName="nova-manage" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308598 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerName="nova-api-log" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308618 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" containerName="nova-api-api" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.308641 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd933eb5-5f13-4e72-910c-aa495cfae9f3" containerName="dnsmasq-dns" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.309935 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.314458 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.314731 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.315236 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.316420 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.409458 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.409506 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99d3047-bb16-4bbe-a77d-0f4199121e7d-logs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.409542 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-config-data\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.409587 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-public-tls-certs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.409866 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbsl\" (UniqueName: \"kubernetes.io/projected/c99d3047-bb16-4bbe-a77d-0f4199121e7d-kube-api-access-tnbsl\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.409954 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.512273 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-config-data\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.512670 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-public-tls-certs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.512758 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnbsl\" (UniqueName: \"kubernetes.io/projected/c99d3047-bb16-4bbe-a77d-0f4199121e7d-kube-api-access-tnbsl\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " 
pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.512780 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.512848 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.512876 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99d3047-bb16-4bbe-a77d-0f4199121e7d-logs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.513247 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99d3047-bb16-4bbe-a77d-0f4199121e7d-logs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.516871 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.516902 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-config-data\") pod 
\"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.516915 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.517489 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-public-tls-certs\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.534418 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnbsl\" (UniqueName: \"kubernetes.io/projected/c99d3047-bb16-4bbe-a77d-0f4199121e7d-kube-api-access-tnbsl\") pod \"nova-api-0\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.647617 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:37:26 crc kubenswrapper[4766]: I1209 03:37:26.851435 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30bb330d-d600-4d0f-86e4-edd3da24bc92" path="/var/lib/kubelet/pods/30bb330d-d600-4d0f-86e4-edd3da24bc92/volumes" Dec 09 03:37:27 crc kubenswrapper[4766]: E1209 03:37:27.075407 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:37:27 crc kubenswrapper[4766]: E1209 03:37:27.076872 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:37:27 crc kubenswrapper[4766]: E1209 03:37:27.079552 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:37:27 crc kubenswrapper[4766]: E1209 03:37:27.079597 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="485630d7-5fe3-4f68-a448-d12f6a7ba6b0" containerName="nova-scheduler-scheduler" Dec 09 03:37:27 crc kubenswrapper[4766]: I1209 03:37:27.103682 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-0"] Dec 09 03:37:27 crc kubenswrapper[4766]: I1209 03:37:27.257397 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99d3047-bb16-4bbe-a77d-0f4199121e7d","Type":"ContainerStarted","Data":"93406039b39958ffc38ff9e1d5f4d92e52d9e962927f8aaf32a6f727c5db1c57"} Dec 09 03:37:28 crc kubenswrapper[4766]: I1209 03:37:28.276798 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99d3047-bb16-4bbe-a77d-0f4199121e7d","Type":"ContainerStarted","Data":"4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16"} Dec 09 03:37:28 crc kubenswrapper[4766]: I1209 03:37:28.277946 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99d3047-bb16-4bbe-a77d-0f4199121e7d","Type":"ContainerStarted","Data":"73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23"} Dec 09 03:37:28 crc kubenswrapper[4766]: I1209 03:37:28.305786 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.305753399 podStartE2EDuration="2.305753399s" podCreationTimestamp="2025-12-09 03:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:37:28.304553917 +0000 UTC m=+1530.013859403" watchObservedRunningTime="2025-12-09 03:37:28.305753399 +0000 UTC m=+1530.015058865" Dec 09 03:37:28 crc kubenswrapper[4766]: I1209 03:37:28.633077 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": read tcp 10.217.0.2:39198->10.217.0.188:8775: read: connection reset by peer" Dec 09 03:37:28 crc kubenswrapper[4766]: I1209 03:37:28.633715 4766 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-metadata-0" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": read tcp 10.217.0.2:39212->10.217.0.188:8775: read: connection reset by peer" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.160289 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.268697 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-config-data\") pod \"95fe51de-b5c0-465d-81e1-7ad319e75a84\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.269222 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fe51de-b5c0-465d-81e1-7ad319e75a84-logs\") pod \"95fe51de-b5c0-465d-81e1-7ad319e75a84\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.269288 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-combined-ca-bundle\") pod \"95fe51de-b5c0-465d-81e1-7ad319e75a84\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.269314 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vcsf\" (UniqueName: \"kubernetes.io/projected/95fe51de-b5c0-465d-81e1-7ad319e75a84-kube-api-access-4vcsf\") pod \"95fe51de-b5c0-465d-81e1-7ad319e75a84\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.269364 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-nova-metadata-tls-certs\") pod \"95fe51de-b5c0-465d-81e1-7ad319e75a84\" (UID: \"95fe51de-b5c0-465d-81e1-7ad319e75a84\") " Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.269902 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95fe51de-b5c0-465d-81e1-7ad319e75a84-logs" (OuterVolumeSpecName: "logs") pod "95fe51de-b5c0-465d-81e1-7ad319e75a84" (UID: "95fe51de-b5c0-465d-81e1-7ad319e75a84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.280468 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fe51de-b5c0-465d-81e1-7ad319e75a84-kube-api-access-4vcsf" (OuterVolumeSpecName: "kube-api-access-4vcsf") pod "95fe51de-b5c0-465d-81e1-7ad319e75a84" (UID: "95fe51de-b5c0-465d-81e1-7ad319e75a84"). InnerVolumeSpecName "kube-api-access-4vcsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.303777 4766 generic.go:334] "Generic (PLEG): container finished" podID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerID="e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a" exitCode=0 Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.304598 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95fe51de-b5c0-465d-81e1-7ad319e75a84","Type":"ContainerDied","Data":"e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a"} Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.304600 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.304637 4766 scope.go:117] "RemoveContainer" containerID="e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.304626 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95fe51de-b5c0-465d-81e1-7ad319e75a84","Type":"ContainerDied","Data":"12c60facbfe0f40f165fa12ca257f5354d95d3f2c2930026341eb91185852bde"} Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.320846 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95fe51de-b5c0-465d-81e1-7ad319e75a84" (UID: "95fe51de-b5c0-465d-81e1-7ad319e75a84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.333803 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-config-data" (OuterVolumeSpecName: "config-data") pod "95fe51de-b5c0-465d-81e1-7ad319e75a84" (UID: "95fe51de-b5c0-465d-81e1-7ad319e75a84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.369325 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "95fe51de-b5c0-465d-81e1-7ad319e75a84" (UID: "95fe51de-b5c0-465d-81e1-7ad319e75a84"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.371859 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fe51de-b5c0-465d-81e1-7ad319e75a84-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.371899 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.371913 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vcsf\" (UniqueName: \"kubernetes.io/projected/95fe51de-b5c0-465d-81e1-7ad319e75a84-kube-api-access-4vcsf\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.371925 4766 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.371935 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fe51de-b5c0-465d-81e1-7ad319e75a84-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.389647 4766 scope.go:117] "RemoveContainer" containerID="c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.420946 4766 scope.go:117] "RemoveContainer" containerID="e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a" Dec 09 03:37:29 crc kubenswrapper[4766]: E1209 03:37:29.428735 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a\": container with ID starting with e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a not found: ID does not exist" containerID="e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.428789 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a"} err="failed to get container status \"e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a\": rpc error: code = NotFound desc = could not find container \"e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a\": container with ID starting with e41cce4d77281d7176a74e0c5911482d2e7714c82d335484d1469ef7a626172a not found: ID does not exist" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.428820 4766 scope.go:117] "RemoveContainer" containerID="c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae" Dec 09 03:37:29 crc kubenswrapper[4766]: E1209 03:37:29.432148 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae\": container with ID starting with c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae not found: ID does not exist" containerID="c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.432197 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae"} err="failed to get container status \"c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae\": rpc error: code = NotFound desc = could not find container \"c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae\": container with ID 
starting with c8ecbaa7bc0c274ec88d085613cd3b0c9c07e092106c83fb2612bd03816dcaae not found: ID does not exist" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.635104 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.646143 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.657734 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:37:29 crc kubenswrapper[4766]: E1209 03:37:29.658305 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-metadata" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.658332 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-metadata" Dec 09 03:37:29 crc kubenswrapper[4766]: E1209 03:37:29.658362 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-log" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.658372 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-log" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.658597 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-metadata" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.658628 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" containerName="nova-metadata-log" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.659886 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.661654 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.662327 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.667683 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.779328 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.779376 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-config-data\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.779397 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ece85ffc-6754-4f25-a66c-cf66043196b3-logs\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.779419 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.779595 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc7nh\" (UniqueName: \"kubernetes.io/projected/ece85ffc-6754-4f25-a66c-cf66043196b3-kube-api-access-kc7nh\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.881145 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7nh\" (UniqueName: \"kubernetes.io/projected/ece85ffc-6754-4f25-a66c-cf66043196b3-kube-api-access-kc7nh\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.881433 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.881509 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-config-data\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.881538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ece85ffc-6754-4f25-a66c-cf66043196b3-logs\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.881588 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.882121 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ece85ffc-6754-4f25-a66c-cf66043196b3-logs\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.893102 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.893586 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.893815 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-config-data\") pod \"nova-metadata-0\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.895464 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc7nh\" (UniqueName: \"kubernetes.io/projected/ece85ffc-6754-4f25-a66c-cf66043196b3-kube-api-access-kc7nh\") pod \"nova-metadata-0\" 
(UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " pod="openstack/nova-metadata-0" Dec 09 03:37:29 crc kubenswrapper[4766]: I1209 03:37:29.980051 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:37:30 crc kubenswrapper[4766]: I1209 03:37:30.462058 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:37:30 crc kubenswrapper[4766]: W1209 03:37:30.470495 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece85ffc_6754_4f25_a66c_cf66043196b3.slice/crio-5ff8aa01f81516bcfd5c50cf48e8ed6aaaa06e7a08533a34c773a04bba76d28b WatchSource:0}: Error finding container 5ff8aa01f81516bcfd5c50cf48e8ed6aaaa06e7a08533a34c773a04bba76d28b: Status 404 returned error can't find the container with id 5ff8aa01f81516bcfd5c50cf48e8ed6aaaa06e7a08533a34c773a04bba76d28b Dec 09 03:37:30 crc kubenswrapper[4766]: I1209 03:37:30.850272 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fe51de-b5c0-465d-81e1-7ad319e75a84" path="/var/lib/kubelet/pods/95fe51de-b5c0-465d-81e1-7ad319e75a84/volumes" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.307242 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.326550 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ece85ffc-6754-4f25-a66c-cf66043196b3","Type":"ContainerStarted","Data":"8f530fae2ed519e55e5f27f969322808cde1ee901eca5360c959594114369789"} Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.326876 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ece85ffc-6754-4f25-a66c-cf66043196b3","Type":"ContainerStarted","Data":"33f5f8cb257a0368fe88133631597afd0d390af9af8575a7cd312d29bca8a767"} Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.326924 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ece85ffc-6754-4f25-a66c-cf66043196b3","Type":"ContainerStarted","Data":"5ff8aa01f81516bcfd5c50cf48e8ed6aaaa06e7a08533a34c773a04bba76d28b"} Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.327837 4766 generic.go:334] "Generic (PLEG): container finished" podID="485630d7-5fe3-4f68-a448-d12f6a7ba6b0" containerID="39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427" exitCode=0 Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.327872 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"485630d7-5fe3-4f68-a448-d12f6a7ba6b0","Type":"ContainerDied","Data":"39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427"} Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.327878 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.327892 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"485630d7-5fe3-4f68-a448-d12f6a7ba6b0","Type":"ContainerDied","Data":"adc731ac9cdbd132ab466b0c36f3b9f827c4aca2f7f2483a51ba2951988b2d61"} Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.327909 4766 scope.go:117] "RemoveContainer" containerID="39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.357263 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.3572435880000002 podStartE2EDuration="2.357243588s" podCreationTimestamp="2025-12-09 03:37:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:37:31.352607644 +0000 UTC m=+1533.061913080" watchObservedRunningTime="2025-12-09 03:37:31.357243588 +0000 UTC m=+1533.066549014" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.363732 4766 scope.go:117] "RemoveContainer" containerID="39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427" Dec 09 03:37:31 crc kubenswrapper[4766]: E1209 03:37:31.364934 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427\": container with ID starting with 39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427 not found: ID does not exist" containerID="39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.365111 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427"} err="failed to get 
container status \"39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427\": rpc error: code = NotFound desc = could not find container \"39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427\": container with ID starting with 39408fbc32222f91fecc1a67bb979ef35e92ecd75ce733f17cd4c8d7c0d80427 not found: ID does not exist" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.424551 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-config-data\") pod \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.424779 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-combined-ca-bundle\") pod \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.424875 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wntwl\" (UniqueName: \"kubernetes.io/projected/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-kube-api-access-wntwl\") pod \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\" (UID: \"485630d7-5fe3-4f68-a448-d12f6a7ba6b0\") " Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.431048 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-kube-api-access-wntwl" (OuterVolumeSpecName: "kube-api-access-wntwl") pod "485630d7-5fe3-4f68-a448-d12f6a7ba6b0" (UID: "485630d7-5fe3-4f68-a448-d12f6a7ba6b0"). InnerVolumeSpecName "kube-api-access-wntwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.451168 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "485630d7-5fe3-4f68-a448-d12f6a7ba6b0" (UID: "485630d7-5fe3-4f68-a448-d12f6a7ba6b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.453732 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-config-data" (OuterVolumeSpecName: "config-data") pod "485630d7-5fe3-4f68-a448-d12f6a7ba6b0" (UID: "485630d7-5fe3-4f68-a448-d12f6a7ba6b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.527858 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.527890 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.527900 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wntwl\" (UniqueName: \"kubernetes.io/projected/485630d7-5fe3-4f68-a448-d12f6a7ba6b0-kube-api-access-wntwl\") on node \"crc\" DevicePath \"\"" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.656119 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.666374 4766 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.679790 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:37:31 crc kubenswrapper[4766]: E1209 03:37:31.680728 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485630d7-5fe3-4f68-a448-d12f6a7ba6b0" containerName="nova-scheduler-scheduler" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.680753 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="485630d7-5fe3-4f68-a448-d12f6a7ba6b0" containerName="nova-scheduler-scheduler" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.681025 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="485630d7-5fe3-4f68-a448-d12f6a7ba6b0" containerName="nova-scheduler-scheduler" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.681940 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.685681 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.713273 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.832786 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jnx7\" (UniqueName: \"kubernetes.io/projected/507e513c-d987-4fba-8fc0-e5ceff892afe-kube-api-access-4jnx7\") pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.833068 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-combined-ca-bundle\") 
pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.833150 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-config-data\") pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.935083 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.935147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-config-data\") pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.935255 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jnx7\" (UniqueName: \"kubernetes.io/projected/507e513c-d987-4fba-8fc0-e5ceff892afe-kube-api-access-4jnx7\") pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.939451 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-config-data\") pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.940336 
4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:31 crc kubenswrapper[4766]: I1209 03:37:31.954397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jnx7\" (UniqueName: \"kubernetes.io/projected/507e513c-d987-4fba-8fc0-e5ceff892afe-kube-api-access-4jnx7\") pod \"nova-scheduler-0\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " pod="openstack/nova-scheduler-0" Dec 09 03:37:32 crc kubenswrapper[4766]: I1209 03:37:32.057263 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:37:32 crc kubenswrapper[4766]: W1209 03:37:32.509399 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod507e513c_d987_4fba_8fc0_e5ceff892afe.slice/crio-4c403ef6fd2f9ac5cfbdba2eb7dc37395e124f01c1e4be74d23567bd872ea6e2 WatchSource:0}: Error finding container 4c403ef6fd2f9ac5cfbdba2eb7dc37395e124f01c1e4be74d23567bd872ea6e2: Status 404 returned error can't find the container with id 4c403ef6fd2f9ac5cfbdba2eb7dc37395e124f01c1e4be74d23567bd872ea6e2 Dec 09 03:37:32 crc kubenswrapper[4766]: I1209 03:37:32.510906 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:37:32 crc kubenswrapper[4766]: I1209 03:37:32.841390 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:37:32 crc kubenswrapper[4766]: E1209 03:37:32.841899 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:37:32 crc kubenswrapper[4766]: I1209 03:37:32.855946 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485630d7-5fe3-4f68-a448-d12f6a7ba6b0" path="/var/lib/kubelet/pods/485630d7-5fe3-4f68-a448-d12f6a7ba6b0/volumes" Dec 09 03:37:33 crc kubenswrapper[4766]: I1209 03:37:33.358761 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"507e513c-d987-4fba-8fc0-e5ceff892afe","Type":"ContainerStarted","Data":"a067a547f7da427c71cff18393daf6fa745ddd3b809d72ffbb267a1fb541cd92"} Dec 09 03:37:33 crc kubenswrapper[4766]: I1209 03:37:33.358832 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"507e513c-d987-4fba-8fc0-e5ceff892afe","Type":"ContainerStarted","Data":"4c403ef6fd2f9ac5cfbdba2eb7dc37395e124f01c1e4be74d23567bd872ea6e2"} Dec 09 03:37:33 crc kubenswrapper[4766]: I1209 03:37:33.386659 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.386638036 podStartE2EDuration="2.386638036s" podCreationTimestamp="2025-12-09 03:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:37:33.376074581 +0000 UTC m=+1535.085380027" watchObservedRunningTime="2025-12-09 03:37:33.386638036 +0000 UTC m=+1535.095943472" Dec 09 03:37:34 crc kubenswrapper[4766]: I1209 03:37:34.981084 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 03:37:34 crc kubenswrapper[4766]: I1209 03:37:34.982467 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 03:37:36 crc 
kubenswrapper[4766]: I1209 03:37:36.648612 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 03:37:36 crc kubenswrapper[4766]: I1209 03:37:36.648981 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 03:37:37 crc kubenswrapper[4766]: I1209 03:37:37.057849 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 03:37:37 crc kubenswrapper[4766]: I1209 03:37:37.663408 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 03:37:37 crc kubenswrapper[4766]: I1209 03:37:37.663444 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 03:37:39 crc kubenswrapper[4766]: I1209 03:37:39.981295 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 03:37:39 crc kubenswrapper[4766]: I1209 03:37:39.981732 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 03:37:40 crc kubenswrapper[4766]: I1209 03:37:40.982288 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 03:37:40 crc kubenswrapper[4766]: I1209 03:37:40.989688 4766 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 09 03:37:42 crc kubenswrapper[4766]: I1209 03:37:42.058108 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 03:37:42 crc kubenswrapper[4766]: I1209 03:37:42.084287 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 03:37:42 crc kubenswrapper[4766]: I1209 03:37:42.592979 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 03:37:46 crc kubenswrapper[4766]: I1209 03:37:46.550025 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 03:37:46 crc kubenswrapper[4766]: I1209 03:37:46.655520 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 03:37:46 crc kubenswrapper[4766]: I1209 03:37:46.656265 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 03:37:46 crc kubenswrapper[4766]: I1209 03:37:46.656376 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 03:37:46 crc kubenswrapper[4766]: I1209 03:37:46.663894 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 03:37:47 crc kubenswrapper[4766]: I1209 03:37:47.595828 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 03:37:47 crc kubenswrapper[4766]: I1209 03:37:47.606056 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 03:37:47 crc kubenswrapper[4766]: I1209 
03:37:47.839385 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:37:47 crc kubenswrapper[4766]: E1209 03:37:47.839625 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:37:49 crc kubenswrapper[4766]: I1209 03:37:49.987341 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 03:37:49 crc kubenswrapper[4766]: I1209 03:37:49.987801 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 03:37:49 crc kubenswrapper[4766]: I1209 03:37:49.995419 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 03:37:49 crc kubenswrapper[4766]: I1209 03:37:49.999446 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 03:37:59 crc kubenswrapper[4766]: I1209 03:37:59.839786 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:37:59 crc kubenswrapper[4766]: E1209 03:37:59.840730 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:38:10 crc 
kubenswrapper[4766]: I1209 03:38:10.807626 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 09 03:38:10 crc kubenswrapper[4766]: I1209 03:38:10.816865 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b121d489-4c67-4106-aa17-ec66f896ba25" containerName="openstackclient" containerID="cri-o://1454c674a2602e750c4beddf30cf919ff742bdcc3255241923700f7fa0a09ef8" gracePeriod=2 Dec 09 03:38:10 crc kubenswrapper[4766]: I1209 03:38:10.818107 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 09 03:38:10 crc kubenswrapper[4766]: I1209 03:38:10.892378 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 03:38:10 crc kubenswrapper[4766]: I1209 03:38:10.950019 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 03:38:10 crc kubenswrapper[4766]: I1209 03:38:10.950274 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="ovn-northd" containerID="cri-o://24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" gracePeriod=30 Dec 09 03:38:10 crc kubenswrapper[4766]: I1209 03:38:10.950662 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="openstack-network-exporter" containerID="cri-o://bd3d08a30c68e8b92264c99f7fd188c185cb8959a30c11e9fd15099288bfbc41" gracePeriod=30 Dec 09 03:38:10 crc kubenswrapper[4766]: I1209 03:38:10.987737 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-bkt59"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.027530 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-bkt59"] Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 
03:38:11.043767 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.058356 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.077619 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.077698 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="ovn-northd" Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.080308 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.080361 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data podName:3af438c1-d0b9-4ecb-bb88-a0efd14736a4 nodeName:}" failed. 
No retries permitted until 2025-12-09 03:38:11.580347399 +0000 UTC m=+1573.289652825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data") pod "rabbitmq-cell1-server-0" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4") : configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.115055 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance9417-account-delete-lg65s"] Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.124130 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b121d489-4c67-4106-aa17-ec66f896ba25" containerName="openstackclient" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.124173 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b121d489-4c67-4106-aa17-ec66f896ba25" containerName="openstackclient" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.124422 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b121d489-4c67-4106-aa17-ec66f896ba25" containerName="openstackclient" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.125050 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.208304 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicand6a3-account-delete-cjwdk"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.209777 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.277189 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance9417-account-delete-lg65s"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.289242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f5be126-5890-4cef-aa82-3bdeef1918cd-operator-scripts\") pod \"barbicand6a3-account-delete-cjwdk\" (UID: \"8f5be126-5890-4cef-aa82-3bdeef1918cd\") " pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.289314 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slrvb\" (UniqueName: \"kubernetes.io/projected/8f5be126-5890-4cef-aa82-3bdeef1918cd-kube-api-access-slrvb\") pod \"barbicand6a3-account-delete-cjwdk\" (UID: \"8f5be126-5890-4cef-aa82-3bdeef1918cd\") " pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.289337 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f227ef39-ddef-411a-96b3-96871679cae1-operator-scripts\") pod \"glance9417-account-delete-lg65s\" (UID: \"f227ef39-ddef-411a-96b3-96871679cae1\") " pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.289395 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl5c9\" (UniqueName: \"kubernetes.io/projected/f227ef39-ddef-411a-96b3-96871679cae1-kube-api-access-bl5c9\") pod \"glance9417-account-delete-lg65s\" (UID: \"f227ef39-ddef-411a-96b3-96871679cae1\") " pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 
03:38:11.292287 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicand6a3-account-delete-cjwdk"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.324505 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement0abc-account-delete-9k2gp"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.330808 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.352954 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement0abc-account-delete-9k2gp"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.369767 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-d2mrs"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.370019 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-d2mrs" podUID="1554173f-b66c-43d5-a5e4-cd10a81f09d4" containerName="openstack-network-exporter" containerID="cri-o://ea40d431b90f6866f082e789a7c9306a894b603e172fe4f549ac16e4a66c9f58" gracePeriod=30 Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.390744 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl5c9\" (UniqueName: \"kubernetes.io/projected/f227ef39-ddef-411a-96b3-96871679cae1-kube-api-access-bl5c9\") pod \"glance9417-account-delete-lg65s\" (UID: \"f227ef39-ddef-411a-96b3-96871679cae1\") " pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.390910 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f5be126-5890-4cef-aa82-3bdeef1918cd-operator-scripts\") pod \"barbicand6a3-account-delete-cjwdk\" (UID: \"8f5be126-5890-4cef-aa82-3bdeef1918cd\") " pod="openstack/barbicand6a3-account-delete-cjwdk" 
Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.390993 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slrvb\" (UniqueName: \"kubernetes.io/projected/8f5be126-5890-4cef-aa82-3bdeef1918cd-kube-api-access-slrvb\") pod \"barbicand6a3-account-delete-cjwdk\" (UID: \"8f5be126-5890-4cef-aa82-3bdeef1918cd\") " pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.391030 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f227ef39-ddef-411a-96b3-96871679cae1-operator-scripts\") pod \"glance9417-account-delete-lg65s\" (UID: \"f227ef39-ddef-411a-96b3-96871679cae1\") " pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.391910 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f227ef39-ddef-411a-96b3-96871679cae1-operator-scripts\") pod \"glance9417-account-delete-lg65s\" (UID: \"f227ef39-ddef-411a-96b3-96871679cae1\") " pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.392791 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f5be126-5890-4cef-aa82-3bdeef1918cd-operator-scripts\") pod \"barbicand6a3-account-delete-cjwdk\" (UID: \"8f5be126-5890-4cef-aa82-3bdeef1918cd\") " pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.447400 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-h9kbs"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.455930 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl5c9\" (UniqueName: 
\"kubernetes.io/projected/f227ef39-ddef-411a-96b3-96871679cae1-kube-api-access-bl5c9\") pod \"glance9417-account-delete-lg65s\" (UID: \"f227ef39-ddef-411a-96b3-96871679cae1\") " pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.459705 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slrvb\" (UniqueName: \"kubernetes.io/projected/8f5be126-5890-4cef-aa82-3bdeef1918cd-kube-api-access-slrvb\") pod \"barbicand6a3-account-delete-cjwdk\" (UID: \"8f5be126-5890-4cef-aa82-3bdeef1918cd\") " pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.468693 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell00dd1-account-delete-flgqj"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.470050 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell00dd1-account-delete-flgqj" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.493750 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a764f29a-d427-43a5-833f-34b2064c122d-operator-scripts\") pod \"placement0abc-account-delete-9k2gp\" (UID: \"a764f29a-d427-43a5-833f-34b2064c122d\") " pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.493852 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cbl8\" (UniqueName: \"kubernetes.io/projected/a764f29a-d427-43a5-833f-34b2064c122d-kube-api-access-2cbl8\") pod \"placement0abc-account-delete-9k2gp\" (UID: \"a764f29a-d427-43a5-833f-34b2064c122d\") " pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.510152 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/novacell00dd1-account-delete-flgqj"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.511637 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.584947 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s25fz"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.590003 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.600976 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a764f29a-d427-43a5-833f-34b2064c122d-operator-scripts\") pod \"placement0abc-account-delete-9k2gp\" (UID: \"a764f29a-d427-43a5-833f-34b2064c122d\") " pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.613789 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cbl8\" (UniqueName: \"kubernetes.io/projected/a764f29a-d427-43a5-833f-34b2064c122d-kube-api-access-2cbl8\") pod \"placement0abc-account-delete-9k2gp\" (UID: \"a764f29a-d427-43a5-833f-34b2064c122d\") " pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.613865 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v95q\" (UniqueName: \"kubernetes.io/projected/8b67744f-1b24-4baf-b397-26cff83c2a4d-kube-api-access-2v95q\") pod \"novacell00dd1-account-delete-flgqj\" (UID: \"8b67744f-1b24-4baf-b397-26cff83c2a4d\") " pod="openstack/novacell00dd1-account-delete-flgqj" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.614080 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts\") pod \"novacell00dd1-account-delete-flgqj\" (UID: \"8b67744f-1b24-4baf-b397-26cff83c2a4d\") " pod="openstack/novacell00dd1-account-delete-flgqj" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.604694 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a764f29a-d427-43a5-833f-34b2064c122d-operator-scripts\") pod \"placement0abc-account-delete-9k2gp\" (UID: \"a764f29a-d427-43a5-833f-34b2064c122d\") " pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.614579 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.614620 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data podName:3af438c1-d0b9-4ecb-bb88-a0efd14736a4 nodeName:}" failed. No retries permitted until 2025-12-09 03:38:12.614606216 +0000 UTC m=+1574.323911642 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data") pod "rabbitmq-cell1-server-0" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4") : configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.668306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cbl8\" (UniqueName: \"kubernetes.io/projected/a764f29a-d427-43a5-833f-34b2064c122d-kube-api-access-2cbl8\") pod \"placement0abc-account-delete-9k2gp\" (UID: \"a764f29a-d427-43a5-833f-34b2064c122d\") " pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.702629 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.731144 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts\") pod \"novacell00dd1-account-delete-flgqj\" (UID: \"8b67744f-1b24-4baf-b397-26cff83c2a4d\") " pod="openstack/novacell00dd1-account-delete-flgqj" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.731425 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v95q\" (UniqueName: \"kubernetes.io/projected/8b67744f-1b24-4baf-b397-26cff83c2a4d-kube-api-access-2v95q\") pod \"novacell00dd1-account-delete-flgqj\" (UID: \"8b67744f-1b24-4baf-b397-26cff83c2a4d\") " pod="openstack/novacell00dd1-account-delete-flgqj" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.732677 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts\") pod \"novacell00dd1-account-delete-flgqj\" (UID: 
\"8b67744f-1b24-4baf-b397-26cff83c2a4d\") " pod="openstack/novacell00dd1-account-delete-flgqj" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.768602 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v95q\" (UniqueName: \"kubernetes.io/projected/8b67744f-1b24-4baf-b397-26cff83c2a4d-kube-api-access-2v95q\") pod \"novacell00dd1-account-delete-flgqj\" (UID: \"8b67744f-1b24-4baf-b397-26cff83c2a4d\") " pod="openstack/novacell00dd1-account-delete-flgqj" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.832909 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-zf6gw"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.847692 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.871343 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-zf6gw"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.893080 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapibaee-account-delete-wvgpg"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.894706 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.947779 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapibaee-account-delete-wvgpg"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.967062 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinderda2b-account-delete-kmsx2"] Dec 09 03:38:11 crc kubenswrapper[4766]: I1209 03:38:11.968646 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.969854 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 03:38:11 crc kubenswrapper[4766]: E1209 03:38:11.969899 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data podName:48862672-08e2-4ac6-86a3-57d84bbc868d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:12.46988361 +0000 UTC m=+1574.179189036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data") pod "rabbitmq-server-0" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d") : configmap "rabbitmq-config-data" not found Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.033101 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutronc2af-account-delete-zxgmn"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.042648 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.054172 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronc2af-account-delete-zxgmn"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.063659 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell00dd1-account-delete-flgqj" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.074010 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderda2b-account-delete-kmsx2"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.099280 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-97s8x"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.101994 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts\") pod \"novaapibaee-account-delete-wvgpg\" (UID: \"77345185-b7d7-46d4-9e72-251bac080f3a\") " pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.102023 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rnzx\" (UniqueName: \"kubernetes.io/projected/77345185-b7d7-46d4-9e72-251bac080f3a-kube-api-access-8rnzx\") pod \"novaapibaee-account-delete-wvgpg\" (UID: \"77345185-b7d7-46d4-9e72-251bac080f3a\") " pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.102053 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frvrg\" (UniqueName: \"kubernetes.io/projected/604402ee-2350-4cec-8a56-b5203c3287e8-kube-api-access-frvrg\") pod \"cinderda2b-account-delete-kmsx2\" (UID: \"604402ee-2350-4cec-8a56-b5203c3287e8\") " pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.102131 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts\") pod \"cinderda2b-account-delete-kmsx2\" 
(UID: \"604402ee-2350-4cec-8a56-b5203c3287e8\") " pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.118789 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-97s8x"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.120859 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d2mrs_1554173f-b66c-43d5-a5e4-cd10a81f09d4/openstack-network-exporter/0.log" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.120915 4766 generic.go:334] "Generic (PLEG): container finished" podID="1554173f-b66c-43d5-a5e4-cd10a81f09d4" containerID="ea40d431b90f6866f082e789a7c9306a894b603e172fe4f549ac16e4a66c9f58" exitCode=2 Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.120998 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d2mrs" event={"ID":"1554173f-b66c-43d5-a5e4-cd10a81f09d4","Type":"ContainerDied","Data":"ea40d431b90f6866f082e789a7c9306a894b603e172fe4f549ac16e4a66c9f58"} Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.162730 4766 generic.go:334] "Generic (PLEG): container finished" podID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerID="bd3d08a30c68e8b92264c99f7fd188c185cb8959a30c11e9fd15099288bfbc41" exitCode=2 Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.162767 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"45ff249a-854d-4c30-8216-b7bd9482e08c","Type":"ContainerDied","Data":"bd3d08a30c68e8b92264c99f7fd188c185cb8959a30c11e9fd15099288bfbc41"} Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.181519 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-j4snv"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.204375 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts\") pod \"cinderda2b-account-delete-kmsx2\" (UID: \"604402ee-2350-4cec-8a56-b5203c3287e8\") " pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.204524 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rnzx\" (UniqueName: \"kubernetes.io/projected/77345185-b7d7-46d4-9e72-251bac080f3a-kube-api-access-8rnzx\") pod \"novaapibaee-account-delete-wvgpg\" (UID: \"77345185-b7d7-46d4-9e72-251bac080f3a\") " pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.204545 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts\") pod \"novaapibaee-account-delete-wvgpg\" (UID: \"77345185-b7d7-46d4-9e72-251bac080f3a\") " pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.204570 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frvrg\" (UniqueName: \"kubernetes.io/projected/604402ee-2350-4cec-8a56-b5203c3287e8-kube-api-access-frvrg\") pod \"cinderda2b-account-delete-kmsx2\" (UID: \"604402ee-2350-4cec-8a56-b5203c3287e8\") " pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.204615 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d1796c-9c3d-444c-bda3-2a7525ac2650-operator-scripts\") pod \"neutronc2af-account-delete-zxgmn\" (UID: \"15d1796c-9c3d-444c-bda3-2a7525ac2650\") " pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.204637 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x67gk\" (UniqueName: \"kubernetes.io/projected/15d1796c-9c3d-444c-bda3-2a7525ac2650-kube-api-access-x67gk\") pod \"neutronc2af-account-delete-zxgmn\" (UID: \"15d1796c-9c3d-444c-bda3-2a7525ac2650\") " pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.205400 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts\") pod \"cinderda2b-account-delete-kmsx2\" (UID: \"604402ee-2350-4cec-8a56-b5203c3287e8\") " pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.206093 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts\") pod \"novaapibaee-account-delete-wvgpg\" (UID: \"77345185-b7d7-46d4-9e72-251bac080f3a\") " pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.279473 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-j4snv"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.295672 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rnzx\" (UniqueName: \"kubernetes.io/projected/77345185-b7d7-46d4-9e72-251bac080f3a-kube-api-access-8rnzx\") pod \"novaapibaee-account-delete-wvgpg\" (UID: \"77345185-b7d7-46d4-9e72-251bac080f3a\") " pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.317536 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d1796c-9c3d-444c-bda3-2a7525ac2650-operator-scripts\") pod \"neutronc2af-account-delete-zxgmn\" (UID: 
\"15d1796c-9c3d-444c-bda3-2a7525ac2650\") " pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.317584 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x67gk\" (UniqueName: \"kubernetes.io/projected/15d1796c-9c3d-444c-bda3-2a7525ac2650-kube-api-access-x67gk\") pod \"neutronc2af-account-delete-zxgmn\" (UID: \"15d1796c-9c3d-444c-bda3-2a7525ac2650\") " pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.318284 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d1796c-9c3d-444c-bda3-2a7525ac2650-operator-scripts\") pod \"neutronc2af-account-delete-zxgmn\" (UID: \"15d1796c-9c3d-444c-bda3-2a7525ac2650\") " pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:12 crc kubenswrapper[4766]: E1209 03:38:12.418840 4766 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-s25fz" message="Exiting ovn-controller (1) " Dec 09 03:38:12 crc kubenswrapper[4766]: E1209 03:38:12.418913 4766 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-s25fz" podUID="f28c984f-04eb-4398-af98-9e2c5e6afd13" containerName="ovn-controller" containerID="cri-o://675041bc225bfcd7e5dd3621e06519a2045ddfa0f1dfb2b19585bfe57589d512" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.418949 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-s25fz" podUID="f28c984f-04eb-4398-af98-9e2c5e6afd13" containerName="ovn-controller" 
containerID="cri-o://675041bc225bfcd7e5dd3621e06519a2045ddfa0f1dfb2b19585bfe57589d512" gracePeriod=30 Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.426356 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnrlz"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.432085 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x67gk\" (UniqueName: \"kubernetes.io/projected/15d1796c-9c3d-444c-bda3-2a7525ac2650-kube-api-access-x67gk\") pod \"neutronc2af-account-delete-zxgmn\" (UID: \"15d1796c-9c3d-444c-bda3-2a7525ac2650\") " pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.498391 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lnrlz"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.500327 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.525574 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-brzmt"] Dec 09 03:38:12 crc kubenswrapper[4766]: E1209 03:38:12.527416 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 03:38:12 crc kubenswrapper[4766]: E1209 03:38:12.527469 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data podName:48862672-08e2-4ac6-86a3-57d84bbc868d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:13.527454355 +0000 UTC m=+1575.236759781 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data") pod "rabbitmq-server-0" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d") : configmap "rabbitmq-config-data" not found Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.567140 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.591564 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-brzmt"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.617300 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frvrg\" (UniqueName: \"kubernetes.io/projected/604402ee-2350-4cec-8a56-b5203c3287e8-kube-api-access-frvrg\") pod \"cinderda2b-account-delete-kmsx2\" (UID: \"604402ee-2350-4cec-8a56-b5203c3287e8\") " pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.619102 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d2mrs_1554173f-b66c-43d5-a5e4-cd10a81f09d4/openstack-network-exporter/0.log" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.619165 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:38:12 crc kubenswrapper[4766]: E1209 03:38:12.629136 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:12 crc kubenswrapper[4766]: E1209 03:38:12.629202 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data podName:3af438c1-d0b9-4ecb-bb88-a0efd14736a4 nodeName:}" failed. 
No retries permitted until 2025-12-09 03:38:14.629184206 +0000 UTC m=+1576.338489632 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data") pod "rabbitmq-cell1-server-0" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4") : configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.642674 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-g8dqt"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.670019 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.709567 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-g8dqt"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.731342 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovn-rundir\") pod \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.731482 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-metrics-certs-tls-certs\") pod \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.731501 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1554173f-b66c-43d5-a5e4-cd10a81f09d4-config\") pod \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 
03:38:12.731553 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lntv\" (UniqueName: \"kubernetes.io/projected/1554173f-b66c-43d5-a5e4-cd10a81f09d4-kube-api-access-4lntv\") pod \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.731578 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-combined-ca-bundle\") pod \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.731630 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovs-rundir\") pod \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\" (UID: \"1554173f-b66c-43d5-a5e4-cd10a81f09d4\") " Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.731933 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "1554173f-b66c-43d5-a5e4-cd10a81f09d4" (UID: "1554173f-b66c-43d5-a5e4-cd10a81f09d4"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.732091 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.732643 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1554173f-b66c-43d5-a5e4-cd10a81f09d4-config" (OuterVolumeSpecName: "config") pod "1554173f-b66c-43d5-a5e4-cd10a81f09d4" (UID: "1554173f-b66c-43d5-a5e4-cd10a81f09d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.732732 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "1554173f-b66c-43d5-a5e4-cd10a81f09d4" (UID: "1554173f-b66c-43d5-a5e4-cd10a81f09d4"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.740013 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-khb7n"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.740368 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" podUID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerName="dnsmasq-dns" containerID="cri-o://247c3eaf1f8095705dd3d8948a06084c404d9dd6d14b121c931d90dca158ef0e" gracePeriod=10 Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.754452 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1554173f-b66c-43d5-a5e4-cd10a81f09d4-kube-api-access-4lntv" (OuterVolumeSpecName: "kube-api-access-4lntv") pod "1554173f-b66c-43d5-a5e4-cd10a81f09d4" (UID: "1554173f-b66c-43d5-a5e4-cd10a81f09d4"). InnerVolumeSpecName "kube-api-access-4lntv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.802891 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.803139 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-log" containerID="cri-o://60568c7893411368ef7934b6d7f5d2360db7af5753dc4500e120b6309331f0b4" gracePeriod=30 Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.803315 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-httpd" containerID="cri-o://f77816b92b5912c6b1c7dc678c8c50b01d826733809db88019e22a9ee0865718" gracePeriod=30 Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.836191 4766 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.836537 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1554173f-b66c-43d5-a5e4-cd10a81f09d4-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.836561 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lntv\" (UniqueName: \"kubernetes.io/projected/1554173f-b66c-43d5-a5e4-cd10a81f09d4-kube-api-access-4lntv\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.836572 4766 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1554173f-b66c-43d5-a5e4-cd10a81f09d4-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.837676 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="openstack-network-exporter" containerID="cri-o://eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033" gracePeriod=300 Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.844319 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:38:12 crc kubenswrapper[4766]: E1209 03:38:12.844841 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.874393 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1554173f-b66c-43d5-a5e4-cd10a81f09d4" (UID: "1554173f-b66c-43d5-a5e4-cd10a81f09d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.925179 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc" path="/var/lib/kubelet/pods/4e0a482b-a0c9-4bf6-9177-bf4cda0d47bc/volumes" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.953379 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f111f0-cbfb-4181-ad96-30f654851763" path="/var/lib/kubelet/pods/51f111f0-cbfb-4181-ad96-30f654851763/volumes" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.984287 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:12 crc kubenswrapper[4766]: I1209 03:38:12.987816 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="ovsdbserver-sb" containerID="cri-o://2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1" gracePeriod=300 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.014229 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d08c90c-fad3-42ae-8950-8d57a79f9654" path="/var/lib/kubelet/pods/6d08c90c-fad3-42ae-8950-8d57a79f9654/volumes" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.017083 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de191a4-7cf0-4999-a102-b96a06b2ba24" 
path="/var/lib/kubelet/pods/8de191a4-7cf0-4999-a102-b96a06b2ba24/volumes" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.017857 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a559b56-2a81-4f1f-b42a-c46550710c47" path="/var/lib/kubelet/pods/9a559b56-2a81-4f1f-b42a-c46550710c47/volumes" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.023385 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a" path="/var/lib/kubelet/pods/d303c2bd-2fb1-4592-9c5e-e4a2d8d43c1a/volumes" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.024264 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de52b533-348b-4fe0-b951-6dfe9c3f86e1" path="/var/lib/kubelet/pods/de52b533-348b-4fe0-b951-6dfe9c3f86e1/volumes" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.049859 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1554173f-b66c-43d5-a5e4-cd10a81f09d4" (UID: "1554173f-b66c-43d5-a5e4-cd10a81f09d4"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.085737 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9f669bd74-mz8xh"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.085783 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.085795 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-hb6ct"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.085805 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-hb6ct"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.085820 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.086065 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-log" containerID="cri-o://4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.086356 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-9f669bd74-mz8xh" podUID="417726b2-75fd-4efc-84ec-803533df86aa" containerName="placement-log" containerID="cri-o://c3015d64e08f82a88af0129948492bc3c6ad857b57c9483bd887eeb673aa3dc3" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.086941 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-httpd" containerID="cri-o://d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.087140 
4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-9f669bd74-mz8xh" podUID="417726b2-75fd-4efc-84ec-803533df86aa" containerName="placement-api" containerID="cri-o://556527868dac8d175c6e11bca57d484310691f25ac252842802df8da12d3c8f3" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.087262 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerName="openstack-network-exporter" containerID="cri-o://16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754" gracePeriod=300 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.090194 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1554173f-b66c-43d5-a5e4-cd10a81f09d4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.131053 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132009 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-server" containerID="cri-o://ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132068 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-server" containerID="cri-o://bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132336 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" 
podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-auditor" containerID="cri-o://0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132419 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-replicator" containerID="cri-o://c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132430 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-updater" containerID="cri-o://071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132190 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-server" containerID="cri-o://c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132200 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-reaper" containerID="cri-o://e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132230 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-auditor" containerID="cri-o://6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132241 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-replicator" containerID="cri-o://b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132162 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-auditor" containerID="cri-o://1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132140 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-updater" containerID="cri-o://6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132581 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="swift-recon-cron" containerID="cri-o://3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132174 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-replicator" containerID="cri-o://d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132675 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="rsync" 
containerID="cri-o://3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.132289 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-expirer" containerID="cri-o://e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.162907 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.210101 4766 generic.go:334] "Generic (PLEG): container finished" podID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerID="60568c7893411368ef7934b6d7f5d2360db7af5753dc4500e120b6309331f0b4" exitCode=143 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.210418 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9edd6e7b-9841-43be-9478-5e7d06d8bd8d","Type":"ContainerDied","Data":"60568c7893411368ef7934b6d7f5d2360db7af5753dc4500e120b6309331f0b4"} Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.217900 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.218031 4766 generic.go:334] "Generic (PLEG): container finished" podID="b121d489-4c67-4106-aa17-ec66f896ba25" containerID="1454c674a2602e750c4beddf30cf919ff742bdcc3255241923700f7fa0a09ef8" exitCode=137 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.218136 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerName="cinder-scheduler" containerID="cri-o://d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 
03:38:13.218225 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerName="probe" containerID="cri-o://fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.274138 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.274629 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-log" containerID="cri-o://73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.274746 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-api" containerID="cri-o://4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.306621 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server" containerID="cri-o://d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" gracePeriod=29 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.317892 4766 generic.go:334] "Generic (PLEG): container finished" podID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerID="247c3eaf1f8095705dd3d8948a06084c404d9dd6d14b121c931d90dca158ef0e" exitCode=0 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.317968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" 
event={"ID":"7443d5d9-873e-430e-bcad-a90f5d4ca9c6","Type":"ContainerDied","Data":"247c3eaf1f8095705dd3d8948a06084c404d9dd6d14b121c931d90dca158ef0e"} Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.321929 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.322229 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" containerName="cinder-api-log" containerID="cri-o://fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.322320 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" containerName="cinder-api" containerID="cri-o://03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.350638 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerName="ovsdbserver-nb" containerID="cri-o://945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd" gracePeriod=300 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.352081 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-649b876b57-9jd5p"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.368705 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-649b876b57-9jd5p" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-api" containerID="cri-o://7d778b40fe61554f26c1ca0d67e81146904e79a7748a3d0c87bbd49972278c52" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.368856 4766 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-649b876b57-9jd5p" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-httpd" containerID="cri-o://17927a86e4710dcdbcd88c8f33269d8edc8c5332c71c20da2f6297f153f510c1" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.374314 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d2mrs_1554173f-b66c-43d5-a5e4-cd10a81f09d4/openstack-network-exporter/0.log" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.374405 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d2mrs" event={"ID":"1554173f-b66c-43d5-a5e4-cd10a81f09d4","Type":"ContainerDied","Data":"1b0f3607954b354a6791b82c9ce793c30e8a89485397e093cde40f4f7445f1ed"} Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.374436 4766 scope.go:117] "RemoveContainer" containerID="ea40d431b90f6866f082e789a7c9306a894b603e172fe4f549ac16e4a66c9f58" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.374566 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-d2mrs" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.386420 4766 generic.go:334] "Generic (PLEG): container finished" podID="f28c984f-04eb-4398-af98-9e2c5e6afd13" containerID="675041bc225bfcd7e5dd3621e06519a2045ddfa0f1dfb2b19585bfe57589d512" exitCode=0 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.386815 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz" event={"ID":"f28c984f-04eb-4398-af98-9e2c5e6afd13","Type":"ContainerDied","Data":"675041bc225bfcd7e5dd3621e06519a2045ddfa0f1dfb2b19585bfe57589d512"} Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.393596 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bf108478-7651-4f37-b0e7-3a571774d030/ovsdbserver-sb/0.log" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.393726 4766 generic.go:334] "Generic (PLEG): container finished" podID="bf108478-7651-4f37-b0e7-3a571774d030" containerID="eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033" exitCode=2 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.393759 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf108478-7651-4f37-b0e7-3a571774d030","Type":"ContainerDied","Data":"eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033"} Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.446309 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.446544 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-log" containerID="cri-o://33f5f8cb257a0368fe88133631597afd0d390af9af8575a7cd312d29bca8a767" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.446699 4766 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-metadata" containerID="cri-o://8f530fae2ed519e55e5f27f969322808cde1ee901eca5360c959594114369789" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.456881 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s25fz" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.493870 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.528675 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59f4fbb654-hrpnd"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.530007 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerName="barbican-keystone-listener-log" containerID="cri-o://0064b87b123267ac3d50f8f1784bd6b3c893066418c3670a19b10112ba20b3b9" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.531309 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerName="barbican-keystone-listener" containerID="cri-o://6d7c92439814de4f38c3b7f907335c896ebf15dd2cc3e1bf15669b628a996f74" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.550287 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerName="rabbitmq" containerID="cri-o://962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18" gracePeriod=604800 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.553703 4766 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" podUID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: connect: connection refused" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.564330 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-64c0-account-create-update-5fb2k"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.588051 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-64c0-account-create-update-5fb2k"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.599609 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f28c984f-04eb-4398-af98-9e2c5e6afd13-scripts\") pod \"f28c984f-04eb-4398-af98-9e2c5e6afd13\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.599689 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lwfk\" (UniqueName: \"kubernetes.io/projected/f28c984f-04eb-4398-af98-9e2c5e6afd13-kube-api-access-9lwfk\") pod \"f28c984f-04eb-4398-af98-9e2c5e6afd13\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.599728 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-ovn-controller-tls-certs\") pod \"f28c984f-04eb-4398-af98-9e2c5e6afd13\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.599826 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run\") pod \"f28c984f-04eb-4398-af98-9e2c5e6afd13\" (UID: 
\"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.599853 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-log-ovn\") pod \"f28c984f-04eb-4398-af98-9e2c5e6afd13\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.599868 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-combined-ca-bundle\") pod \"f28c984f-04eb-4398-af98-9e2c5e6afd13\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.599900 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run-ovn\") pod \"f28c984f-04eb-4398-af98-9e2c5e6afd13\" (UID: \"f28c984f-04eb-4398-af98-9e2c5e6afd13\") " Dec 09 03:38:13 crc kubenswrapper[4766]: E1209 03:38:13.600362 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 03:38:13 crc kubenswrapper[4766]: E1209 03:38:13.600408 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data podName:48862672-08e2-4ac6-86a3-57d84bbc868d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:15.600395538 +0000 UTC m=+1577.309700964 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data") pod "rabbitmq-server-0" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d") : configmap "rabbitmq-config-data" not found Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.600860 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28c984f-04eb-4398-af98-9e2c5e6afd13-scripts" (OuterVolumeSpecName: "scripts") pod "f28c984f-04eb-4398-af98-9e2c5e6afd13" (UID: "f28c984f-04eb-4398-af98-9e2c5e6afd13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.600929 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f28c984f-04eb-4398-af98-9e2c5e6afd13" (UID: "f28c984f-04eb-4398-af98-9e2c5e6afd13"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.606767 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f28c984f-04eb-4398-af98-9e2c5e6afd13" (UID: "f28c984f-04eb-4398-af98-9e2c5e6afd13"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.606979 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run" (OuterVolumeSpecName: "var-run") pod "f28c984f-04eb-4398-af98-9e2c5e6afd13" (UID: "f28c984f-04eb-4398-af98-9e2c5e6afd13"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.611851 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28c984f-04eb-4398-af98-9e2c5e6afd13-kube-api-access-9lwfk" (OuterVolumeSpecName: "kube-api-access-9lwfk") pod "f28c984f-04eb-4398-af98-9e2c5e6afd13" (UID: "f28c984f-04eb-4398-af98-9e2c5e6afd13"). InnerVolumeSpecName "kube-api-access-9lwfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.629481 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" containerName="galera" containerID="cri-o://703a57a4d0aff6c0b3a602487fcde7105c12acc5a93f46652d99af88e4758cbd" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.629593 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" containerID="cri-o://1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" gracePeriod=28 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.651524 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7b4dbbdb47-q4g54"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.657186 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" containerName="barbican-worker-log" containerID="cri-o://1f7f05c852cc98494db502851dafc9d9e8f7eb88e225048ac88ebdb63d25b528" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.657475 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" 
containerName="barbican-worker" containerID="cri-o://6344e7369a853e77c3393dd6d7339e8c504e823cd554068d45409f77989aba7d" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.674133 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f28c984f-04eb-4398-af98-9e2c5e6afd13" (UID: "f28c984f-04eb-4398-af98-9e2c5e6afd13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.676750 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dkdgh"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.700342 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dkdgh"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.707664 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.707697 4766 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.707711 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.707719 4766 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f28c984f-04eb-4398-af98-9e2c5e6afd13-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:13 crc kubenswrapper[4766]: 
I1209 03:38:13.707727 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f28c984f-04eb-4398-af98-9e2c5e6afd13-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.707736 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lwfk\" (UniqueName: \"kubernetes.io/projected/f28c984f-04eb-4398-af98-9e2c5e6afd13-kube-api-access-9lwfk\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.707887 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7dd946b7cc-x6vjx"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.708106 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerName="proxy-httpd" containerID="cri-o://f326e4c3ffff3569aafc2007ea39c3a003c9a3599ad5eed432501fa822d61ae4" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.709262 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerName="proxy-server" containerID="cri-o://fed7164d33b3a7af7a40fbaa9797bc168284e01eb659de885498276e5f9eafb6" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.729757 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b9bc9ddf8-vdj7m"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.729998 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api-log" containerID="cri-o://7f712ce3fa09ea2b5ba75d7f96299ac239eca89fd085f13671911969bcb7a465" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.730160 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api" containerID="cri-o://0e789a722d40821da0b1865e918a8aa00c5015bf08b60022e5a34cfa2cf5a712" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.754932 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.755230 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="354d9984-d7b5-4540-a96e-a68a7bf1b667" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ce107d8559532857d9f216c112581bb9f83cc5d523eef80c33104385461d818c" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.763117 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.763444 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="507e513c-d987-4fba-8fc0-e5ceff892afe" containerName="nova-scheduler-scheduler" containerID="cri-o://a067a547f7da427c71cff18393daf6fa745ddd3b809d72ffbb267a1fb541cd92" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.812859 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.814625 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "f28c984f-04eb-4398-af98-9e2c5e6afd13" (UID: "f28c984f-04eb-4398-af98-9e2c5e6afd13"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.833459 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.833795 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="1dfb6314-1f18-4e71-947e-534dc1021381" containerName="nova-cell1-conductor-conductor" containerID="cri-o://9ccd949b9d58181646f9d2d30d971f167213b68716379d53e92812bc4142132d" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.867772 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7kpv"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.876127 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c7kpv"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.887051 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.887268 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="ec41cb84-c47e-4199-ac5d-825bbf4f7023" containerName="nova-cell0-conductor-conductor" containerID="cri-o://d9c17096fe1163fbded2f9420441c595a2d9ae76cf7dab37fbf8a31e46afc79e" gracePeriod=30 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.910585 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28c984f-04eb-4398-af98-9e2c5e6afd13-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.914544 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5f62v"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.917182 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerName="rabbitmq" containerID="cri-o://99fdba038c7e2b9ae61670d79759f6a9042c470a8ddf322cdd4ad3f61ffc2528" gracePeriod=604800 Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.943316 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5f62v"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.976127 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance9417-account-delete-lg65s"] Dec 09 03:38:13 crc kubenswrapper[4766]: I1209 03:38:13.993322 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-d2mrs"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.047929 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-d2mrs"] Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.061868 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1 is running failed: container process not found" containerID="2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.077360 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1 is running failed: container process not found" containerID="2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.083798 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc 
= container is not created or running: checking if PID of 2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1 is running failed: container process not found" containerID="2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.083874 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="ovsdbserver-sb" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.290031 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.300415 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.329131 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config-secret\") pod \"b121d489-4c67-4106-aa17-ec66f896ba25\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.329228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd7t7\" (UniqueName: \"kubernetes.io/projected/b121d489-4c67-4106-aa17-ec66f896ba25-kube-api-access-dd7t7\") pod \"b121d489-4c67-4106-aa17-ec66f896ba25\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.329315 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-combined-ca-bundle\") pod \"b121d489-4c67-4106-aa17-ec66f896ba25\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.329428 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config\") pod \"b121d489-4c67-4106-aa17-ec66f896ba25\" (UID: \"b121d489-4c67-4106-aa17-ec66f896ba25\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.336420 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bf108478-7651-4f37-b0e7-3a571774d030/ovsdbserver-sb/0.log" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.336490 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.358769 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b121d489-4c67-4106-aa17-ec66f896ba25-kube-api-access-dd7t7" (OuterVolumeSpecName: "kube-api-access-dd7t7") pod "b121d489-4c67-4106-aa17-ec66f896ba25" (UID: "b121d489-4c67-4106-aa17-ec66f896ba25"). InnerVolumeSpecName "kube-api-access-dd7t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.398443 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b121d489-4c67-4106-aa17-ec66f896ba25" (UID: "b121d489-4c67-4106-aa17-ec66f896ba25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.405542 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b121d489-4c67-4106-aa17-ec66f896ba25" (UID: "b121d489-4c67-4106-aa17-ec66f896ba25"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.414050 4766 scope.go:117] "RemoveContainer" containerID="1454c674a2602e750c4beddf30cf919ff742bdcc3255241923700f7fa0a09ef8" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.414526 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.432631 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-ovsdbserver-sb-tls-certs\") pod \"bf108478-7651-4f37-b0e7-3a571774d030\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.432736 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf108478-7651-4f37-b0e7-3a571774d030-ovsdb-rundir\") pod \"bf108478-7651-4f37-b0e7-3a571774d030\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.432771 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-metrics-certs-tls-certs\") pod \"bf108478-7651-4f37-b0e7-3a571774d030\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.432817 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-scripts\") pod \"bf108478-7651-4f37-b0e7-3a571774d030\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.432840 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgqf5\" (UniqueName: \"kubernetes.io/projected/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-kube-api-access-jgqf5\") pod \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.432873 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-config\") pod \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.432908 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-swift-storage-0\") pod \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.432980 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-sb\") pod \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.433027 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-nb\") pod \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.433062 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-svc\") pod \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\" (UID: \"7443d5d9-873e-430e-bcad-a90f5d4ca9c6\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.433104 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"bf108478-7651-4f37-b0e7-3a571774d030\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 
03:38:14.433231 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m29xs\" (UniqueName: \"kubernetes.io/projected/bf108478-7651-4f37-b0e7-3a571774d030-kube-api-access-m29xs\") pod \"bf108478-7651-4f37-b0e7-3a571774d030\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.433269 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-config\") pod \"bf108478-7651-4f37-b0e7-3a571774d030\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.433309 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-combined-ca-bundle\") pod \"bf108478-7651-4f37-b0e7-3a571774d030\" (UID: \"bf108478-7651-4f37-b0e7-3a571774d030\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.434819 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.434843 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd7t7\" (UniqueName: \"kubernetes.io/projected/b121d489-4c67-4106-aa17-ec66f896ba25-kube-api-access-dd7t7\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.434864 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.447973 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbicand6a3-account-delete-cjwdk"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.455352 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" event={"ID":"7443d5d9-873e-430e-bcad-a90f5d4ca9c6","Type":"ContainerDied","Data":"93a744d7458295293fcc130bdda48bf9f5f93f092bcdf76a102a27cd922c0629"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.455488 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-khb7n" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.457390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf108478-7651-4f37-b0e7-3a571774d030-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "bf108478-7651-4f37-b0e7-3a571774d030" (UID: "bf108478-7651-4f37-b0e7-3a571774d030"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.457571 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b121d489-4c67-4106-aa17-ec66f896ba25" (UID: "b121d489-4c67-4106-aa17-ec66f896ba25"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.461108 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-config" (OuterVolumeSpecName: "config") pod "bf108478-7651-4f37-b0e7-3a571774d030" (UID: "bf108478-7651-4f37-b0e7-3a571774d030"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.459453 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement0abc-account-delete-9k2gp"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.464730 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-scripts" (OuterVolumeSpecName: "scripts") pod "bf108478-7651-4f37-b0e7-3a571774d030" (UID: "bf108478-7651-4f37-b0e7-3a571774d030"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.472757 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-kube-api-access-jgqf5" (OuterVolumeSpecName: "kube-api-access-jgqf5") pod "7443d5d9-873e-430e-bcad-a90f5d4ca9c6" (UID: "7443d5d9-873e-430e-bcad-a90f5d4ca9c6"). InnerVolumeSpecName "kube-api-access-jgqf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.492461 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf108478-7651-4f37-b0e7-3a571774d030-kube-api-access-m29xs" (OuterVolumeSpecName: "kube-api-access-m29xs") pod "bf108478-7651-4f37-b0e7-3a571774d030" (UID: "bf108478-7651-4f37-b0e7-3a571774d030"). InnerVolumeSpecName "kube-api-access-m29xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.499482 4766 generic.go:334] "Generic (PLEG): container finished" podID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.499581 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7443d5d9-873e-430e-bcad-a90f5d4ca9c6" (UID: "7443d5d9-873e-430e-bcad-a90f5d4ca9c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.499584 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h9kbs" event={"ID":"149434b0-ace1-4e8f-9be4-76eb650f7c7f","Type":"ContainerDied","Data":"d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.500913 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "bf108478-7651-4f37-b0e7-3a571774d030" (UID: "bf108478-7651-4f37-b0e7-3a571774d030"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.508095 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9417-account-delete-lg65s" event={"ID":"f227ef39-ddef-411a-96b3-96871679cae1","Type":"ContainerStarted","Data":"f3dc3e9efda4eb76a73048732b0e5928cb31c1081be17736b6813a8bd5fb7266"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.508150 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9417-account-delete-lg65s" event={"ID":"f227ef39-ddef-411a-96b3-96871679cae1","Type":"ContainerStarted","Data":"2b4533d377f99d716727f5650ad919e28c5788b4d20032a432f98e174dcbc4dc"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.512087 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bf108478-7651-4f37-b0e7-3a571774d030/ovsdbserver-sb/0.log" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.512129 4766 generic.go:334] "Generic (PLEG): container finished" podID="bf108478-7651-4f37-b0e7-3a571774d030" containerID="2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.512167 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf108478-7651-4f37-b0e7-3a571774d030","Type":"ContainerDied","Data":"2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.512188 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bf108478-7651-4f37-b0e7-3a571774d030","Type":"ContainerDied","Data":"f46190535232e13e382b5632c2fd59eb976e2a34a4308e7615d3a96df87994ca"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.512394 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.517671 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf108478-7651-4f37-b0e7-3a571774d030" (UID: "bf108478-7651-4f37-b0e7-3a571774d030"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.518384 4766 generic.go:334] "Generic (PLEG): container finished" podID="417726b2-75fd-4efc-84ec-803533df86aa" containerID="c3015d64e08f82a88af0129948492bc3c6ad857b57c9483bd887eeb673aa3dc3" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.518495 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f669bd74-mz8xh" event={"ID":"417726b2-75fd-4efc-84ec-803533df86aa","Type":"ContainerDied","Data":"c3015d64e08f82a88af0129948492bc3c6ad857b57c9483bd887eeb673aa3dc3"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.524899 4766 generic.go:334] "Generic (PLEG): container finished" podID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerID="33f5f8cb257a0368fe88133631597afd0d390af9af8575a7cd312d29bca8a767" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.524953 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ece85ffc-6754-4f25-a66c-cf66043196b3","Type":"ContainerDied","Data":"33f5f8cb257a0368fe88133631597afd0d390af9af8575a7cd312d29bca8a767"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.528700 4766 generic.go:334] "Generic (PLEG): container finished" podID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerID="17927a86e4710dcdbcd88c8f33269d8edc8c5332c71c20da2f6297f153f510c1" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.528784 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-649b876b57-9jd5p" event={"ID":"be23a05e-591f-4bdf-9c5f-8ee930181397","Type":"ContainerDied","Data":"17927a86e4710dcdbcd88c8f33269d8edc8c5332c71c20da2f6297f153f510c1"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.531523 4766 generic.go:334] "Generic (PLEG): container finished" podID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerID="fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.531555 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3cf59e92-46b9-4693-b9ec-1a669b0e3897","Type":"ContainerDied","Data":"fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536194 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536228 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf108478-7651-4f37-b0e7-3a571774d030-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536236 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536246 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgqf5\" (UniqueName: \"kubernetes.io/projected/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-kube-api-access-jgqf5\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536255 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536283 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536293 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b121d489-4c67-4106-aa17-ec66f896ba25-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536301 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m29xs\" (UniqueName: \"kubernetes.io/projected/bf108478-7651-4f37-b0e7-3a571774d030-kube-api-access-m29xs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.536309 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf108478-7651-4f37-b0e7-3a571774d030-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.544237 4766 generic.go:334] "Generic (PLEG): container finished" podID="354d9984-d7b5-4540-a96e-a68a7bf1b667" containerID="ce107d8559532857d9f216c112581bb9f83cc5d523eef80c33104385461d818c" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.544500 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"354d9984-d7b5-4540-a96e-a68a7bf1b667","Type":"ContainerDied","Data":"ce107d8559532857d9f216c112581bb9f83cc5d523eef80c33104385461d818c"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.545819 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-swift-storage-0" (OuterVolumeSpecName: 
"dns-swift-storage-0") pod "7443d5d9-873e-430e-bcad-a90f5d4ca9c6" (UID: "7443d5d9-873e-430e-bcad-a90f5d4ca9c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.563446 4766 generic.go:334] "Generic (PLEG): container finished" podID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerID="4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.563520 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6a00c8b-af47-4254-83de-a93a975b3afe","Type":"ContainerDied","Data":"4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.569877 4766 generic.go:334] "Generic (PLEG): container finished" podID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerID="fed7164d33b3a7af7a40fbaa9797bc168284e01eb659de885498276e5f9eafb6" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.569904 4766 generic.go:334] "Generic (PLEG): container finished" podID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerID="f326e4c3ffff3569aafc2007ea39c3a003c9a3599ad5eed432501fa822d61ae4" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.569943 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" event={"ID":"2ff493c4-bb15-4a40-9499-ca23bf79f42b","Type":"ContainerDied","Data":"fed7164d33b3a7af7a40fbaa9797bc168284e01eb659de885498276e5f9eafb6"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.569967 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" event={"ID":"2ff493c4-bb15-4a40-9499-ca23bf79f42b","Type":"ContainerDied","Data":"f326e4c3ffff3569aafc2007ea39c3a003c9a3599ad5eed432501fa822d61ae4"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.580036 4766 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.586736 4766 generic.go:334] "Generic (PLEG): container finished" podID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerID="73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.586830 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99d3047-bb16-4bbe-a77d-0f4199121e7d","Type":"ContainerDied","Data":"73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.595270 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s25fz" event={"ID":"f28c984f-04eb-4398-af98-9e2c5e6afd13","Type":"ContainerDied","Data":"c92b2db5364b6a896e5c6ccd0c8f9f1bcb96154372510e3ce5fc3f0d3bc9e4e8"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.595483 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s25fz" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.605903 4766 generic.go:334] "Generic (PLEG): container finished" podID="5edf46d6-e570-425b-843d-d67f5adde599" containerID="7f712ce3fa09ea2b5ba75d7f96299ac239eca89fd085f13671911969bcb7a465" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.605961 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" event={"ID":"5edf46d6-e570-425b-843d-d67f5adde599","Type":"ContainerDied","Data":"7f712ce3fa09ea2b5ba75d7f96299ac239eca89fd085f13671911969bcb7a465"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.617604 4766 generic.go:334] "Generic (PLEG): container finished" podID="04522868-a66d-44f8-a9bb-6f157f26653f" containerID="1f7f05c852cc98494db502851dafc9d9e8f7eb88e225048ac88ebdb63d25b528" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.617683 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" event={"ID":"04522868-a66d-44f8-a9bb-6f157f26653f","Type":"ContainerDied","Data":"1f7f05c852cc98494db502851dafc9d9e8f7eb88e225048ac88ebdb63d25b528"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.634666 4766 generic.go:334] "Generic (PLEG): container finished" podID="47770faa-9973-4d81-a630-8c344bcd7b94" containerID="fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.634865 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47770faa-9973-4d81-a630-8c344bcd7b94","Type":"ContainerDied","Data":"fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.637918 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.637940 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.638009 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.638068 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data podName:3af438c1-d0b9-4ecb-bb88-a0efd14736a4 nodeName:}" failed. No retries permitted until 2025-12-09 03:38:18.6380491 +0000 UTC m=+1580.347354566 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data") pod "rabbitmq-cell1-server-0" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4") : configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.652231 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2bb7e0f1-e5a8-45ef-9cbf-e308e82968af/ovsdbserver-nb/0.log" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.652308 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.656701 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-config" (OuterVolumeSpecName: "config") pod "7443d5d9-873e-430e-bcad-a90f5d4ca9c6" (UID: "7443d5d9-873e-430e-bcad-a90f5d4ca9c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.661457 4766 scope.go:117] "RemoveContainer" containerID="247c3eaf1f8095705dd3d8948a06084c404d9dd6d14b121c931d90dca158ef0e" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.668766 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7443d5d9-873e-430e-bcad-a90f5d4ca9c6" (UID: "7443d5d9-873e-430e-bcad-a90f5d4ca9c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.670755 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7443d5d9-873e-430e-bcad-a90f5d4ca9c6" (UID: "7443d5d9-873e-430e-bcad-a90f5d4ca9c6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671851 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671872 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671882 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671889 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671896 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671902 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671908 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671913 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671920 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671926 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671932 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671938 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671944 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671950 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14" exitCode=0 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.671955 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f"} Dec 09 03:38:14 crc 
kubenswrapper[4766]: I1209 03:38:14.671989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672001 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672014 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672023 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672032 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672041 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672049 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672059 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672067 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672075 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672083 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672091 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.672100 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 
03:38:14.674391 4766 generic.go:334] "Generic (PLEG): container finished" podID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerID="0064b87b123267ac3d50f8f1784bd6b3c893066418c3670a19b10112ba20b3b9" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.674455 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" event={"ID":"7ae0a18c-f118-45b5-8989-9ca3a49827ad","Type":"ContainerDied","Data":"0064b87b123267ac3d50f8f1784bd6b3c893066418c3670a19b10112ba20b3b9"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.674894 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "bf108478-7651-4f37-b0e7-3a571774d030" (UID: "bf108478-7651-4f37-b0e7-3a571774d030"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.676164 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2bb7e0f1-e5a8-45ef-9cbf-e308e82968af/ovsdbserver-nb/0.log" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.676221 4766 generic.go:334] "Generic (PLEG): container finished" podID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerID="16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754" exitCode=2 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.676234 4766 generic.go:334] "Generic (PLEG): container finished" podID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerID="945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd" exitCode=143 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.676251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af","Type":"ContainerDied","Data":"16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.676269 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af","Type":"ContainerDied","Data":"945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd"} Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.676340 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.689821 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s25fz"] Dec 09 03:38:14 crc kubenswrapper[4766]: W1209 03:38:14.705487 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d1796c_9c3d_444c_bda3_2a7525ac2650.slice/crio-1518ae646abca79f08536d3239e5b6ffa1e4128f020d18e52bc568634df2d560 WatchSource:0}: Error finding container 1518ae646abca79f08536d3239e5b6ffa1e4128f020d18e52bc568634df2d560: Status 404 returned error can't find the container with id 1518ae646abca79f08536d3239e5b6ffa1e4128f020d18e52bc568634df2d560 Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.707812 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s25fz"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.714912 4766 scope.go:117] "RemoveContainer" containerID="e1349528a0d97478daa0910811d2f8c4ed839212e46f900922e25dfb57923704" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.729814 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronc2af-account-delete-zxgmn"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.737394 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell00dd1-account-delete-flgqj"] 
Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.739601 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvhdn\" (UniqueName: \"kubernetes.io/projected/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-kube-api-access-mvhdn\") pod \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.739706 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-config\") pod \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.739744 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-scripts\") pod \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.739830 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdbserver-nb-tls-certs\") pod \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.739899 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdb-rundir\") pod \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.739923 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-combined-ca-bundle\") pod \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.739994 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.740042 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-metrics-certs-tls-certs\") pod \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\" (UID: \"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af\") " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.740450 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.740469 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.740479 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.740491 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7443d5d9-873e-430e-bcad-a90f5d4ca9c6-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 
03:38:14.741488 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-config" (OuterVolumeSpecName: "config") pod "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" (UID: "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.743486 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" (UID: "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.743964 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-scripts" (OuterVolumeSpecName: "scripts") pod "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" (UID: "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.744859 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-kube-api-access-mvhdn" (OuterVolumeSpecName: "kube-api-access-mvhdn") pod "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" (UID: "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af"). InnerVolumeSpecName "kube-api-access-mvhdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.745387 4766 scope.go:117] "RemoveContainer" containerID="eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.759878 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "bf108478-7651-4f37-b0e7-3a571774d030" (UID: "bf108478-7651-4f37-b0e7-3a571774d030"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.785458 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" (UID: "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.809382 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" (UID: "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.824365 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-khb7n"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.824448 4766 scope.go:117] "RemoveContainer" containerID="2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.836804 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-khb7n"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.842432 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.842484 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf108478-7651-4f37-b0e7-3a571774d030-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.842497 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.842528 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.842559 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvhdn\" (UniqueName: \"kubernetes.io/projected/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-kube-api-access-mvhdn\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.842574 4766 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.842585 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.877983 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068033ee-d9d8-4cbb-b82a-ced63f563e08" path="/var/lib/kubelet/pods/068033ee-d9d8-4cbb-b82a-ced63f563e08/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.878948 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ff86be1-0422-4655-ba5c-b063b8c42a61" path="/var/lib/kubelet/pods/0ff86be1-0422-4655-ba5c-b063b8c42a61/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.879539 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1554173f-b66c-43d5-a5e4-cd10a81f09d4" path="/var/lib/kubelet/pods/1554173f-b66c-43d5-a5e4-cd10a81f09d4/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.880844 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c8159d8-7758-4a35-ba5c-51ce1edae988" path="/var/lib/kubelet/pods/1c8159d8-7758-4a35-ba5c-51ce1edae988/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.881817 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" path="/var/lib/kubelet/pods/7443d5d9-873e-430e-bcad-a90f5d4ca9c6/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.882417 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b121d489-4c67-4106-aa17-ec66f896ba25" path="/var/lib/kubelet/pods/b121d489-4c67-4106-aa17-ec66f896ba25/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 
03:38:14.882875 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c079d8-5ee5-42ba-ad34-165566762c81" path="/var/lib/kubelet/pods/c0c079d8-5ee5-42ba-ad34-165566762c81/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.883772 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4623229-9cd3-4c95-bba1-1c202cb8f07c" path="/var/lib/kubelet/pods/e4623229-9cd3-4c95-bba1-1c202cb8f07c/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.884441 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28c984f-04eb-4398-af98-9e2c5e6afd13" path="/var/lib/kubelet/pods/f28c984f-04eb-4398-af98-9e2c5e6afd13/volumes" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.887980 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.890326 4766 scope.go:117] "RemoveContainer" containerID="eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033" Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.891352 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033\": container with ID starting with eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033 not found: ID does not exist" containerID="eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.891390 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033"} err="failed to get container status \"eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033\": rpc error: code = NotFound desc = could not find container \"eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033\": container with ID starting with 
eaf4e0a6760eabf645a303215fe894130c757119f758e9d2b87d451a0580a033 not found: ID does not exist" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.891413 4766 scope.go:117] "RemoveContainer" containerID="2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1" Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.891919 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1\": container with ID starting with 2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1 not found: ID does not exist" containerID="2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.892000 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.892006 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1"} err="failed to get container status \"2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1\": rpc error: code = NotFound desc = could not find container \"2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1\": container with ID starting with 2e715c21b3148e95f9916f9617a8f70a6a0adbe76ff6886108b5d09c152873e1 not found: ID does not exist" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.892142 4766 scope.go:117] "RemoveContainer" containerID="675041bc225bfcd7e5dd3621e06519a2045ddfa0f1dfb2b19585bfe57589d512" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.917055 4766 scope.go:117] "RemoveContainer" containerID="16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.957683 4766 scope.go:117] "RemoveContainer" 
containerID="945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.976509 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.986387 4766 scope.go:117] "RemoveContainer" containerID="16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754" Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.987953 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754\": container with ID starting with 16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754 not found: ID does not exist" containerID="16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.987984 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754"} err="failed to get container status \"16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754\": rpc error: code = NotFound desc = could not find container \"16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754\": container with ID starting with 16ed7709722412c384beb0350dabcc6ab540fc4cb30857497479ebe9bb05c754 not found: ID does not exist" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.988004 4766 scope.go:117] "RemoveContainer" containerID="945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd" Dec 09 03:38:14 crc kubenswrapper[4766]: E1209 03:38:14.989054 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd\": container with 
ID starting with 945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd not found: ID does not exist" containerID="945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd" Dec 09 03:38:14 crc kubenswrapper[4766]: I1209 03:38:14.989091 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd"} err="failed to get container status \"945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd\": rpc error: code = NotFound desc = could not find container \"945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd\": container with ID starting with 945c2e1ecd37317478125e04955e0e293f6b4e43fff46c7f7b05030d26dccddd not found: ID does not exist" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.046470 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.082878 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.148444 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-internal-tls-certs\") pod \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.148560 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-run-httpd\") pod \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.148614 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8jxk\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-kube-api-access-c8jxk\") pod \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.148654 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-combined-ca-bundle\") pod \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.148700 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-public-tls-certs\") pod \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.148734 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-config-data\") pod \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.148757 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-log-httpd\") pod \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.148808 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-etc-swift\") pod \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\" (UID: \"2ff493c4-bb15-4a40-9499-ca23bf79f42b\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.150445 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ff493c4-bb15-4a40-9499-ca23bf79f42b" (UID: "2ff493c4-bb15-4a40-9499-ca23bf79f42b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.152243 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ff493c4-bb15-4a40-9499-ca23bf79f42b" (UID: "2ff493c4-bb15-4a40-9499-ca23bf79f42b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.159325 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2ff493c4-bb15-4a40-9499-ca23bf79f42b" (UID: "2ff493c4-bb15-4a40-9499-ca23bf79f42b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.166945 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-kube-api-access-c8jxk" (OuterVolumeSpecName: "kube-api-access-c8jxk") pod "2ff493c4-bb15-4a40-9499-ca23bf79f42b" (UID: "2ff493c4-bb15-4a40-9499-ca23bf79f42b"). InnerVolumeSpecName "kube-api-access-c8jxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.243567 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" (UID: "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.258023 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" (UID: "2bb7e0f1-e5a8-45ef-9cbf-e308e82968af"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.264699 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.264730 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8jxk\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-kube-api-access-c8jxk\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.264742 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ff493c4-bb15-4a40-9499-ca23bf79f42b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.264753 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.264767 4766 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ff493c4-bb15-4a40-9499-ca23bf79f42b-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.274345 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff493c4-bb15-4a40-9499-ca23bf79f42b" (UID: "2ff493c4-bb15-4a40-9499-ca23bf79f42b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.289500 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2ff493c4-bb15-4a40-9499-ca23bf79f42b" (UID: "2ff493c4-bb15-4a40-9499-ca23bf79f42b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.291393 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-config-data" (OuterVolumeSpecName: "config-data") pod "2ff493c4-bb15-4a40-9499-ca23bf79f42b" (UID: "2ff493c4-bb15-4a40-9499-ca23bf79f42b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.317395 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ff493c4-bb15-4a40-9499-ca23bf79f42b" (UID: "2ff493c4-bb15-4a40-9499-ca23bf79f42b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.366830 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.366858 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.366867 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.366876 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.366888 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff493c4-bb15-4a40-9499-ca23bf79f42b-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.447935 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderda2b-account-delete-kmsx2"] Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.506940 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapibaee-account-delete-wvgpg"] Dec 09 03:38:15 crc kubenswrapper[4766]: W1209 03:38:15.544413 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77345185_b7d7_46d4_9e72_251bac080f3a.slice/crio-d66e936afde637eec05db5d0f8188e162f3620b11a8361f5764713663ae26831 WatchSource:0}: Error finding container d66e936afde637eec05db5d0f8188e162f3620b11a8361f5764713663ae26831: Status 404 returned error can't find the container with id d66e936afde637eec05db5d0f8188e162f3620b11a8361f5764713663ae26831 Dec 09 03:38:15 crc kubenswrapper[4766]: E1209 03:38:15.679657 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 03:38:15 crc kubenswrapper[4766]: E1209 03:38:15.679734 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data podName:48862672-08e2-4ac6-86a3-57d84bbc868d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:19.67971537 +0000 UTC m=+1581.389020796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data") pod "rabbitmq-server-0" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d") : configmap "rabbitmq-config-data" not found Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.691443 4766 generic.go:334] "Generic (PLEG): container finished" podID="f227ef39-ddef-411a-96b3-96871679cae1" containerID="f3dc3e9efda4eb76a73048732b0e5928cb31c1081be17736b6813a8bd5fb7266" exitCode=0 Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.691533 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9417-account-delete-lg65s" event={"ID":"f227ef39-ddef-411a-96b3-96871679cae1","Type":"ContainerDied","Data":"f3dc3e9efda4eb76a73048732b0e5928cb31c1081be17736b6813a8bd5fb7266"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.698944 4766 generic.go:334] "Generic (PLEG): container finished" podID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" 
containerID="6d7c92439814de4f38c3b7f907335c896ebf15dd2cc3e1bf15669b628a996f74" exitCode=0 Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.699013 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" event={"ID":"7ae0a18c-f118-45b5-8989-9ca3a49827ad","Type":"ContainerDied","Data":"6d7c92439814de4f38c3b7f907335c896ebf15dd2cc3e1bf15669b628a996f74"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.704967 4766 generic.go:334] "Generic (PLEG): container finished" podID="a764f29a-d427-43a5-833f-34b2064c122d" containerID="25a975fcaebea6149ff93cc7ae0f516d54a7143da50c55f98fc5ae61d5ecb604" exitCode=0 Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.705066 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0abc-account-delete-9k2gp" event={"ID":"a764f29a-d427-43a5-833f-34b2064c122d","Type":"ContainerDied","Data":"25a975fcaebea6149ff93cc7ae0f516d54a7143da50c55f98fc5ae61d5ecb604"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.705093 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0abc-account-delete-9k2gp" event={"ID":"a764f29a-d427-43a5-833f-34b2064c122d","Type":"ContainerStarted","Data":"f964c089d41b85fc1e3d8993a58705d0549a7ab94fd29c88c613be604a83b1ec"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.714096 4766 generic.go:334] "Generic (PLEG): container finished" podID="a57927d7-7099-4b87-99ee-77aa589cd09f" containerID="703a57a4d0aff6c0b3a602487fcde7105c12acc5a93f46652d99af88e4758cbd" exitCode=0 Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.714198 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a57927d7-7099-4b87-99ee-77aa589cd09f","Type":"ContainerDied","Data":"703a57a4d0aff6c0b3a602487fcde7105c12acc5a93f46652d99af88e4758cbd"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.714373 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"a57927d7-7099-4b87-99ee-77aa589cd09f","Type":"ContainerDied","Data":"137fa822e08a33698279473229d907147e31d90bbc7c9e4dcb25c748c6933d73"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.714391 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="137fa822e08a33698279473229d907147e31d90bbc7c9e4dcb25c748c6933d73" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.719724 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibaee-account-delete-wvgpg" event={"ID":"77345185-b7d7-46d4-9e72-251bac080f3a","Type":"ContainerStarted","Data":"d66e936afde637eec05db5d0f8188e162f3620b11a8361f5764713663ae26831"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.724308 4766 generic.go:334] "Generic (PLEG): container finished" podID="8f5be126-5890-4cef-aa82-3bdeef1918cd" containerID="7a3e415d4be0c928fbd52e3bfdb8da0e258cbf44b903072d268634c023795eab" exitCode=0 Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.724359 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicand6a3-account-delete-cjwdk" event={"ID":"8f5be126-5890-4cef-aa82-3bdeef1918cd","Type":"ContainerDied","Data":"7a3e415d4be0c928fbd52e3bfdb8da0e258cbf44b903072d268634c023795eab"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.724381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicand6a3-account-delete-cjwdk" event={"ID":"8f5be126-5890-4cef-aa82-3bdeef1918cd","Type":"ContainerStarted","Data":"72d8bba3608afcd65c56a78c71842042d1f3b5ec0095ba80470e00475ba35ae9"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.728763 4766 generic.go:334] "Generic (PLEG): container finished" podID="04522868-a66d-44f8-a9bb-6f157f26653f" containerID="6344e7369a853e77c3393dd6d7339e8c504e823cd554068d45409f77989aba7d" exitCode=0 Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.728809 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7b4dbbdb47-q4g54" event={"ID":"04522868-a66d-44f8-a9bb-6f157f26653f","Type":"ContainerDied","Data":"6344e7369a853e77c3393dd6d7339e8c504e823cd554068d45409f77989aba7d"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.728827 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" event={"ID":"04522868-a66d-44f8-a9bb-6f157f26653f","Type":"ContainerDied","Data":"b807a1c23dc3b78a2932a9d12cb33a247ec78fbd973ff078dc74c3646076ffb2"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.728838 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b807a1c23dc3b78a2932a9d12cb33a247ec78fbd973ff078dc74c3646076ffb2" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.729584 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00dd1-account-delete-flgqj" event={"ID":"8b67744f-1b24-4baf-b397-26cff83c2a4d","Type":"ContainerStarted","Data":"9dac53f58fd3a7c0ee08d6358faf161329b52a6ebc57d4fa08e84a3534b2440c"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.731832 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" event={"ID":"2ff493c4-bb15-4a40-9499-ca23bf79f42b","Type":"ContainerDied","Data":"5a7a3cc2e9290abbc06f47ea749606fbcae32bc77197854c9ed797c2530224b6"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.731863 4766 scope.go:117] "RemoveContainer" containerID="fed7164d33b3a7af7a40fbaa9797bc168284e01eb659de885498276e5f9eafb6" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.731957 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7dd946b7cc-x6vjx" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.740690 4766 generic.go:334] "Generic (PLEG): container finished" podID="15d1796c-9c3d-444c-bda3-2a7525ac2650" containerID="ef3231ca6d75883603382318582483a2a49c6400d230ad1c3f2e0b1d23a36876" exitCode=0 Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.740745 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronc2af-account-delete-zxgmn" event={"ID":"15d1796c-9c3d-444c-bda3-2a7525ac2650","Type":"ContainerDied","Data":"ef3231ca6d75883603382318582483a2a49c6400d230ad1c3f2e0b1d23a36876"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.740763 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronc2af-account-delete-zxgmn" event={"ID":"15d1796c-9c3d-444c-bda3-2a7525ac2650","Type":"ContainerStarted","Data":"1518ae646abca79f08536d3239e5b6ffa1e4128f020d18e52bc568634df2d560"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.756163 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2bb7e0f1-e5a8-45ef-9cbf-e308e82968af","Type":"ContainerDied","Data":"a4a8ba2bfdd506a1c1b89820974c6e51623c0fe28b6eedb24965e834e14c36cd"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.769711 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderda2b-account-delete-kmsx2" event={"ID":"604402ee-2350-4cec-8a56-b5203c3287e8","Type":"ContainerStarted","Data":"1abee6058ecbf6347063a4c2be175b5b3754936923625fc26ba3b16a8bb825e0"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.793018 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"354d9984-d7b5-4540-a96e-a68a7bf1b667","Type":"ContainerDied","Data":"9f2831e19b47104f2e7157a8f0f1a5b07c69cf3c45b38540e44f7797bf0fae3c"} Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.793062 4766 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="9f2831e19b47104f2e7157a8f0f1a5b07c69cf3c45b38540e44f7797bf0fae3c" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.871992 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.937171 4766 scope.go:117] "RemoveContainer" containerID="f326e4c3ffff3569aafc2007ea39c3a003c9a3599ad5eed432501fa822d61ae4" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.962482 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.993752 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvgc8\" (UniqueName: \"kubernetes.io/projected/a57927d7-7099-4b87-99ee-77aa589cd09f-kube-api-access-wvgc8\") pod \"a57927d7-7099-4b87-99ee-77aa589cd09f\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.993799 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-generated\") pod \"a57927d7-7099-4b87-99ee-77aa589cd09f\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.993843 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-kolla-config\") pod \"a57927d7-7099-4b87-99ee-77aa589cd09f\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.993913 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-combined-ca-bundle\") pod \"a57927d7-7099-4b87-99ee-77aa589cd09f\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.993961 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-default\") pod \"a57927d7-7099-4b87-99ee-77aa589cd09f\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.994142 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-operator-scripts\") pod \"a57927d7-7099-4b87-99ee-77aa589cd09f\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.994478 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"a57927d7-7099-4b87-99ee-77aa589cd09f\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.994582 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-galera-tls-certs\") pod \"a57927d7-7099-4b87-99ee-77aa589cd09f\" (UID: \"a57927d7-7099-4b87-99ee-77aa589cd09f\") " Dec 09 03:38:15 crc kubenswrapper[4766]: I1209 03:38:15.999201 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "a57927d7-7099-4b87-99ee-77aa589cd09f" (UID: "a57927d7-7099-4b87-99ee-77aa589cd09f"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.001568 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a57927d7-7099-4b87-99ee-77aa589cd09f" (UID: "a57927d7-7099-4b87-99ee-77aa589cd09f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.001578 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "a57927d7-7099-4b87-99ee-77aa589cd09f" (UID: "a57927d7-7099-4b87-99ee-77aa589cd09f"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.003385 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7dd946b7cc-x6vjx"] Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.014047 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7dd946b7cc-x6vjx"] Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.020680 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a57927d7-7099-4b87-99ee-77aa589cd09f" (UID: "a57927d7-7099-4b87-99ee-77aa589cd09f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:16 crc kubenswrapper[4766]: E1209 03:38:16.027459 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 03:38:16 crc kubenswrapper[4766]: E1209 03:38:16.029044 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.040455 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57927d7-7099-4b87-99ee-77aa589cd09f-kube-api-access-wvgc8" (OuterVolumeSpecName: "kube-api-access-wvgc8") pod "a57927d7-7099-4b87-99ee-77aa589cd09f" (UID: "a57927d7-7099-4b87-99ee-77aa589cd09f"). InnerVolumeSpecName "kube-api-access-wvgc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.041240 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:38:16 crc kubenswrapper[4766]: E1209 03:38:16.041551 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 09 03:38:16 crc kubenswrapper[4766]: E1209 03:38:16.041611 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="ovn-northd" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.067941 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.070043 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.084826 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.091019 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "a57927d7-7099-4b87-99ee-77aa589cd09f" (UID: "a57927d7-7099-4b87-99ee-77aa589cd09f"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.098817 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04522868-a66d-44f8-a9bb-6f157f26653f-logs\") pod \"04522868-a66d-44f8-a9bb-6f157f26653f\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.098896 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-config-data\") pod \"354d9984-d7b5-4540-a96e-a68a7bf1b667\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.098998 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data-custom\") pod \"04522868-a66d-44f8-a9bb-6f157f26653f\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099020 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-combined-ca-bundle\") pod \"04522868-a66d-44f8-a9bb-6f157f26653f\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099044 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-vencrypt-tls-certs\") pod \"354d9984-d7b5-4540-a96e-a68a7bf1b667\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099067 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data\") pod \"04522868-a66d-44f8-a9bb-6f157f26653f\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099098 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2zpw\" (UniqueName: \"kubernetes.io/projected/04522868-a66d-44f8-a9bb-6f157f26653f-kube-api-access-d2zpw\") pod \"04522868-a66d-44f8-a9bb-6f157f26653f\" (UID: \"04522868-a66d-44f8-a9bb-6f157f26653f\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099124 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-combined-ca-bundle\") pod \"354d9984-d7b5-4540-a96e-a68a7bf1b667\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099174 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxq6l\" (UniqueName: \"kubernetes.io/projected/354d9984-d7b5-4540-a96e-a68a7bf1b667-kube-api-access-dxq6l\") pod \"354d9984-d7b5-4540-a96e-a68a7bf1b667\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099256 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-nova-novncproxy-tls-certs\") pod \"354d9984-d7b5-4540-a96e-a68a7bf1b667\" (UID: \"354d9984-d7b5-4540-a96e-a68a7bf1b667\") " Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099269 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04522868-a66d-44f8-a9bb-6f157f26653f-logs" (OuterVolumeSpecName: "logs") pod "04522868-a66d-44f8-a9bb-6f157f26653f" (UID: "04522868-a66d-44f8-a9bb-6f157f26653f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099649 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvgc8\" (UniqueName: \"kubernetes.io/projected/a57927d7-7099-4b87-99ee-77aa589cd09f-kube-api-access-wvgc8\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099660 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099669 4766 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099678 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099685 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04522868-a66d-44f8-a9bb-6f157f26653f-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099693 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a57927d7-7099-4b87-99ee-77aa589cd09f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:16 crc kubenswrapper[4766]: I1209 03:38:16.099714 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.127378 
4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ccd949b9d58181646f9d2d30d971f167213b68716379d53e92812bc4142132d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.128873 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ccd949b9d58181646f9d2d30d971f167213b68716379d53e92812bc4142132d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.130031 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9ccd949b9d58181646f9d2d30d971f167213b68716379d53e92812bc4142132d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.130062 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="1dfb6314-1f18-4e71-947e-534dc1021381" containerName="nova-cell1-conductor-conductor" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.131295 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "04522868-a66d-44f8-a9bb-6f157f26653f" (UID: "04522868-a66d-44f8-a9bb-6f157f26653f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.147953 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354d9984-d7b5-4540-a96e-a68a7bf1b667-kube-api-access-dxq6l" (OuterVolumeSpecName: "kube-api-access-dxq6l") pod "354d9984-d7b5-4540-a96e-a68a7bf1b667" (UID: "354d9984-d7b5-4540-a96e-a68a7bf1b667"). InnerVolumeSpecName "kube-api-access-dxq6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.157012 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04522868-a66d-44f8-a9bb-6f157f26653f-kube-api-access-d2zpw" (OuterVolumeSpecName: "kube-api-access-d2zpw") pod "04522868-a66d-44f8-a9bb-6f157f26653f" (UID: "04522868-a66d-44f8-a9bb-6f157f26653f"). InnerVolumeSpecName "kube-api-access-d2zpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.201622 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data-custom\") pod \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.201861 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data\") pod \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.201917 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae0a18c-f118-45b5-8989-9ca3a49827ad-logs\") pod \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\" (UID: 
\"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.201956 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-combined-ca-bundle\") pod \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.201988 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn7bj\" (UniqueName: \"kubernetes.io/projected/7ae0a18c-f118-45b5-8989-9ca3a49827ad-kube-api-access-pn7bj\") pod \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\" (UID: \"7ae0a18c-f118-45b5-8989-9ca3a49827ad\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.202375 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.202388 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2zpw\" (UniqueName: \"kubernetes.io/projected/04522868-a66d-44f8-a9bb-6f157f26653f-kube-api-access-d2zpw\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.202398 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxq6l\" (UniqueName: \"kubernetes.io/projected/354d9984-d7b5-4540-a96e-a68a7bf1b667-kube-api-access-dxq6l\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.203189 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae0a18c-f118-45b5-8989-9ca3a49827ad-logs" (OuterVolumeSpecName: "logs") pod "7ae0a18c-f118-45b5-8989-9ca3a49827ad" (UID: "7ae0a18c-f118-45b5-8989-9ca3a49827ad"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.238385 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a57927d7-7099-4b87-99ee-77aa589cd09f" (UID: "a57927d7-7099-4b87-99ee-77aa589cd09f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.256023 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae0a18c-f118-45b5-8989-9ca3a49827ad-kube-api-access-pn7bj" (OuterVolumeSpecName: "kube-api-access-pn7bj") pod "7ae0a18c-f118-45b5-8989-9ca3a49827ad" (UID: "7ae0a18c-f118-45b5-8989-9ca3a49827ad"). InnerVolumeSpecName "kube-api-access-pn7bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.256987 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.260499 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="ceilometer-central-agent" containerID="cri-o://d74114f82622e892b4a5e3b24618b3e3f03c2d8600dbbfb022a279d06521b94a" gracePeriod=30 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.260888 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="proxy-httpd" containerID="cri-o://7f14118b1949b7f4fa11ccf3e1c0978b97f42aeb4702c3b2333fce0c82a717cf" gracePeriod=30 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.260935 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="sg-core" containerID="cri-o://6d71bb4e4b5fbaf252c35fbed7673f51dd06fc7cc2b55da69c2bf9d00916cfed" gracePeriod=30 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.260968 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="ceilometer-notification-agent" containerID="cri-o://f4f799754c7d424d9e1a02edb786aa1d853e958a00bbe81227da9a2cf14ca4ee" gracePeriod=30 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.261771 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ae0a18c-f118-45b5-8989-9ca3a49827ad" (UID: "7ae0a18c-f118-45b5-8989-9ca3a49827ad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.306654 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.306691 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae0a18c-f118-45b5-8989-9ca3a49827ad-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.306703 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn7bj\" (UniqueName: \"kubernetes.io/projected/7ae0a18c-f118-45b5-8989-9ca3a49827ad-kube-api-access-pn7bj\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.306717 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.310919 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.311115 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d1f2b76e-7443-46d3-a296-76196dcc28b7" containerName="kube-state-metrics" containerID="cri-o://521a97f36468751bc00689dc257329f3eef8640694009d32dbf2669dae3f1809" gracePeriod=30 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.453963 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.454205 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="01ada9c6-91af-4717-a157-29070bf61a6e" containerName="memcached" containerID="cri-o://d07009fe163e1ff1f0efb68a9304c65721f93731af9b5db45462d77518e37f29" gracePeriod=30 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.483472 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.526065 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.567597 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wmp64"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.583340 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ltkbl"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.589375 4766 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wmp64"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.603235 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ltkbl"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.608366 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5fdb76977-cn2gb"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.608620 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5fdb76977-cn2gb" podUID="18f03bec-d533-450d-b79b-7f19dc436d94" containerName="keystone-api" containerID="cri-o://425a19b5cf582ca012bbb5a8e1a477b968972c4db75ac0ca49551bb4ae484803" gracePeriod=30 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.626621 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.642273 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mwwn7"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.649653 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mwwn7"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.655762 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-baee-account-create-update-2lz8j"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.666230 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ae0a18c-f118-45b5-8989-9ca3a49827ad" (UID: "7ae0a18c-f118-45b5-8989-9ca3a49827ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.671400 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.671491 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.671588 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.672364 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.672482 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.672523 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server" Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.675380 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.675465 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.685486 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-baee-account-create-update-2lz8j"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.695205 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibaee-account-delete-wvgpg"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.700950 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-create-67ng6"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.711136 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-67ng6"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.724564 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-49e7-account-create-update-ctlzz"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.731654 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data" (OuterVolumeSpecName: "config-data") pod "04522868-a66d-44f8-a9bb-6f157f26653f" (UID: "04522868-a66d-44f8-a9bb-6f157f26653f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.733649 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-49e7-account-create-update-ctlzz"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.762326 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-m4l8d"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.763645 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.763672 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.774101 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-m4l8d"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.792258 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-da2b-account-create-update-pq9nx"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.796365 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "354d9984-d7b5-4540-a96e-a68a7bf1b667" (UID: "354d9984-d7b5-4540-a96e-a68a7bf1b667"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.803169 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-da2b-account-create-update-pq9nx"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.815750 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "a57927d7-7099-4b87-99ee-77aa589cd09f" (UID: "a57927d7-7099-4b87-99ee-77aa589cd09f"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.817647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibaee-account-delete-wvgpg" event={"ID":"77345185-b7d7-46d4-9e72-251bac080f3a","Type":"ContainerStarted","Data":"e72c786adf539230163f0d59c1d7e3330d6b3bfa65aeb081d7df9f3a4a6897e1"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.818116 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novaapibaee-account-delete-wvgpg" secret="" err="secret \"galera-openstack-dockercfg-g8hnh\" not found" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.823046 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00dd1-account-delete-flgqj" event={"ID":"8b67744f-1b24-4baf-b397-26cff83c2a4d","Type":"ContainerStarted","Data":"b964f8ee7b5a62d95ec8c21e2f2cb9480262bdc47471ecb02c71bcf732440066"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.823587 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell00dd1-account-delete-flgqj" secret="" err="secret \"galera-openstack-dockercfg-g8hnh\" not found" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.840538 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9417-account-delete-lg65s" event={"ID":"f227ef39-ddef-411a-96b3-96871679cae1","Type":"ContainerDied","Data":"2b4533d377f99d716727f5650ad919e28c5788b4d20032a432f98e174dcbc4dc"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.840572 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b4533d377f99d716727f5650ad919e28c5788b4d20032a432f98e174dcbc4dc" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.858818 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/cinderda2b-account-delete-kmsx2" secret="" err="secret \"galera-openstack-dockercfg-g8hnh\" not found" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.868156 4766 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a57927d7-7099-4b87-99ee-77aa589cd09f-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.868184 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.871481 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190b0a88-5610-4895-a497-36c4f6c06810" path="/var/lib/kubelet/pods/190b0a88-5610-4895-a497-36c4f6c06810/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.875122 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.886599 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" path="/var/lib/kubelet/pods/2bb7e0f1-e5a8-45ef-9cbf-e308e82968af/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.887982 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" path="/var/lib/kubelet/pods/2ff493c4-bb15-4a40-9499-ca23bf79f42b/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.889932 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a4873b-e4c4-4284-97a6-baa8346e4849" path="/var/lib/kubelet/pods/41a4873b-e4c4-4284-97a6-baa8346e4849/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.892276 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6280c589-1dc7-47f0-9c57-cfdc56dd28ee" path="/var/lib/kubelet/pods/6280c589-1dc7-47f0-9c57-cfdc56dd28ee/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.893229 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a4954a-fbea-4371-bfd0-cb7681daa75e" path="/var/lib/kubelet/pods/74a4954a-fbea-4371-bfd0-cb7681daa75e/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.893450 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data" (OuterVolumeSpecName: "config-data") pod "7ae0a18c-f118-45b5-8989-9ca3a49827ad" (UID: "7ae0a18c-f118-45b5-8989-9ca3a49827ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.893642 4766 generic.go:334] "Generic (PLEG): container finished" podID="d1f2b76e-7443-46d3-a296-76196dcc28b7" containerID="521a97f36468751bc00689dc257329f3eef8640694009d32dbf2669dae3f1809" exitCode=2 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.893833 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c9ae005-e7a2-4af5-9ff6-48f3364c4496" path="/var/lib/kubelet/pods/8c9ae005-e7a2-4af5-9ff6-48f3364c4496/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.894818 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef5df73-cf11-4fae-bd6f-e91cc2f767aa" path="/var/lib/kubelet/pods/8ef5df73-cf11-4fae-bd6f-e91cc2f767aa/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.896142 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba577939-09d4-40a6-b1e7-98f607984111" path="/var/lib/kubelet/pods/ba577939-09d4-40a6-b1e7-98f607984111/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.897731 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf108478-7651-4f37-b0e7-3a571774d030" path="/var/lib/kubelet/pods/bf108478-7651-4f37-b0e7-3a571774d030/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.899276 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04522868-a66d-44f8-a9bb-6f157f26653f" (UID: "04522868-a66d-44f8-a9bb-6f157f26653f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.899457 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de709648-266f-40ff-97e8-8427d71f31b6" path="/var/lib/kubelet/pods/de709648-266f-40ff-97e8-8427d71f31b6/volumes" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.906735 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-config-data" (OuterVolumeSpecName: "config-data") pod "354d9984-d7b5-4540-a96e-a68a7bf1b667" (UID: "354d9984-d7b5-4540-a96e-a68a7bf1b667"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.906943 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "354d9984-d7b5-4540-a96e-a68a7bf1b667" (UID: "354d9984-d7b5-4540-a96e-a68a7bf1b667"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.907024 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:55484->10.217.0.200:8775: read: connection reset by peer" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.908128 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:55482->10.217.0.200:8775: read: connection reset by peer" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.914970 4766 generic.go:334] "Generic (PLEG): container finished" podID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerID="6d71bb4e4b5fbaf252c35fbed7673f51dd06fc7cc2b55da69c2bf9d00916cfed" exitCode=2 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.918122 4766 generic.go:334] "Generic (PLEG): container finished" podID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerID="f77816b92b5912c6b1c7dc678c8c50b01d826733809db88019e22a9ee0865718" exitCode=0 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.918345 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.920420 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.920449 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.959242 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "354d9984-d7b5-4540-a96e-a68a7bf1b667" (UID: "354d9984-d7b5-4540-a96e-a68a7bf1b667"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.969317 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ae0a18c-f118-45b5-8989-9ca3a49827ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.969337 4766 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.969349 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.969359 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04522868-a66d-44f8-a9bb-6f157f26653f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.969367 4766 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/354d9984-d7b5-4540-a96e-a68a7bf1b667-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.969403 4766 configmap.go:193] Couldn't get configMap 
openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.969443 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts podName:604402ee-2350-4cec-8a56-b5203c3287e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:38:17.469429284 +0000 UTC m=+1579.178734710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts") pod "cinderda2b-account-delete-kmsx2" (UID: "604402ee-2350-4cec-8a56-b5203c3287e8") : configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.969938 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.969963 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts podName:8b67744f-1b24-4baf-b397-26cff83c2a4d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:17.469955299 +0000 UTC m=+1579.179260725 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts") pod "novacell00dd1-account-delete-flgqj" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d") : configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.969986 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:16.970002 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts podName:77345185-b7d7-46d4-9e72-251bac080f3a nodeName:}" failed. No retries permitted until 2025-12-09 03:38:17.46999727 +0000 UTC m=+1579.179302696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts") pod "novaapibaee-account-delete-wvgpg" (UID: "77345185-b7d7-46d4-9e72-251bac080f3a") : configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.986261 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapibaee-account-delete-wvgpg" podStartSLOduration=5.986238667 podStartE2EDuration="5.986238667s" podCreationTimestamp="2025-12-09 03:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:38:16.845490544 +0000 UTC m=+1578.554795960" watchObservedRunningTime="2025-12-09 03:38:16.986238667 +0000 UTC m=+1578.695544093" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.993060 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-log" probeResult="failure" output="Get 
\"https://10.217.0.171:9292/healthcheck\": dial tcp 10.217.0.171:9292: connect: connection refused" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:16.993278 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.171:9292/healthcheck\": dial tcp 10.217.0.171:9292: connect: connection refused" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.003271 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell00dd1-account-delete-flgqj" podStartSLOduration=6.003251315 podStartE2EDuration="6.003251315s" podCreationTimestamp="2025-12-09 03:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:38:16.861075565 +0000 UTC m=+1578.570381001" watchObservedRunningTime="2025-12-09 03:38:17.003251315 +0000 UTC m=+1578.712556741" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.012457 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinderda2b-account-delete-kmsx2" podStartSLOduration=6.012442284 podStartE2EDuration="6.012442284s" podCreationTimestamp="2025-12-09 03:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 03:38:16.880276581 +0000 UTC m=+1578.589582007" watchObservedRunningTime="2025-12-09 03:38:17.012442284 +0000 UTC m=+1578.721747710" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.056456 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.173:9292/healthcheck\": dial tcp 10.217.0.173:9292: connect: connection refused" 
Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.056515 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.173:9292/healthcheck\": dial tcp 10.217.0.173:9292: connect: connection refused" Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.100365 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a067a547f7da427c71cff18393daf6fa745ddd3b809d72ffbb267a1fb541cd92" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.132827 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77345185_b7d7_46d4_9e72_251bac080f3a.slice/crio-conmon-e72c786adf539230163f0d59c1d7e3330d6b3bfa65aeb081d7df9f3a4a6897e1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77345185_b7d7_46d4_9e72_251bac080f3a.slice/crio-e72c786adf539230163f0d59c1d7e3330d6b3bfa65aeb081d7df9f3a4a6897e1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece85ffc_6754_4f25_a66c_cf66043196b3.slice/crio-conmon-8f530fae2ed519e55e5f27f969322808cde1ee901eca5360c959594114369789.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod604402ee_2350_4cec_8a56_b5203c3287e8.slice/crio-conmon-9532c7b831e1593a4ce20c4ef7d36aa8ca50e6e5eec7b7417c942f408c559489.scope\": RecentStats: unable to find data in memory cache]" Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.142412 4766 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a067a547f7da427c71cff18393daf6fa745ddd3b809d72ffbb267a1fb541cd92" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.169356 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a067a547f7da427c71cff18393daf6fa745ddd3b809d72ffbb267a1fb541cd92" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.169703 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="507e513c-d987-4fba-8fc0-e5ceff892afe" containerName="nova-scheduler-scheduler" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.221240 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" containerName="galera" containerID="cri-o://0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466" gracePeriod=30 Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.411957 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d9c17096fe1163fbded2f9420441c595a2d9ae76cf7dab37fbf8a31e46afc79e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.414583 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d9c17096fe1163fbded2f9420441c595a2d9ae76cf7dab37fbf8a31e46afc79e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.416992 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d9c17096fe1163fbded2f9420441c595a2d9ae76cf7dab37fbf8a31e46afc79e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.417021 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="ec41cb84-c47e-4199-ac5d-825bbf4f7023" containerName="nova-cell0-conductor-conductor" Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.484737 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.484832 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts podName:604402ee-2350-4cec-8a56-b5203c3287e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:38:18.484815342 +0000 UTC m=+1580.194120768 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts") pod "cinderda2b-account-delete-kmsx2" (UID: "604402ee-2350-4cec-8a56-b5203c3287e8") : configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.485130 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.485152 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts podName:8b67744f-1b24-4baf-b397-26cff83c2a4d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:18.485145341 +0000 UTC m=+1580.194450767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts") pod "novacell00dd1-account-delete-flgqj" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d") : configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.485181 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.493729 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts podName:77345185-b7d7-46d4-9e72-251bac080f3a nodeName:}" failed. No retries permitted until 2025-12-09 03:38:18.485195812 +0000 UTC m=+1580.194501238 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts") pod "novaapibaee-account-delete-wvgpg" (UID: "77345185-b7d7-46d4-9e72-251bac080f3a") : configmap "openstack-scripts" not found Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.647358 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.197:3000/\": read tcp 10.217.0.2:40298->10.217.0.197:3000: read: connection reset by peer" Dec 09 03:38:17 crc kubenswrapper[4766]: E1209 03:38:17.849784 4766 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.009s" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.849822 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderda2b-account-delete-kmsx2" event={"ID":"604402ee-2350-4cec-8a56-b5203c3287e8","Type":"ContainerStarted","Data":"9532c7b831e1593a4ce20c4ef7d36aa8ca50e6e5eec7b7417c942f408c559489"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.849850 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderda2b-account-delete-kmsx2"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.849909 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-59f4fbb654-hrpnd" event={"ID":"7ae0a18c-f118-45b5-8989-9ca3a49827ad","Type":"ContainerDied","Data":"f108bb438c724162f0547e30e4efd9a9875f3b030db705d3b3ecd584c0eb0426"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.849925 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7cv6p"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.849939 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7cv6p"] Dec 09 03:38:17 crc 
kubenswrapper[4766]: I1209 03:38:17.849953 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d1f2b76e-7443-46d3-a296-76196dcc28b7","Type":"ContainerDied","Data":"521a97f36468751bc00689dc257329f3eef8640694009d32dbf2669dae3f1809"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.849967 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerDied","Data":"6d71bb4e4b5fbaf252c35fbed7673f51dd06fc7cc2b55da69c2bf9d00916cfed"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.849979 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronc2af-account-delete-zxgmn"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.849989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9edd6e7b-9841-43be-9478-5e7d06d8bd8d","Type":"ContainerDied","Data":"f77816b92b5912c6b1c7dc678c8c50b01d826733809db88019e22a9ee0865718"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.850000 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c2af-account-create-update-n7gmp"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.850012 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c2af-account-create-update-n7gmp"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.850038 4766 scope.go:117] "RemoveContainer" containerID="6d7c92439814de4f38c3b7f907335c896ebf15dd2cc3e1bf15669b628a996f74" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.859752 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.880611 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.885757 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.908353 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.911517 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.914490 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-59f4fbb654-hrpnd"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.915765 4766 scope.go:117] "RemoveContainer" containerID="0064b87b123267ac3d50f8f1784bd6b3c893066418c3670a19b10112ba20b3b9" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.936642 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-59f4fbb654-hrpnd"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.941501 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.942622 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.942757 4766 generic.go:334] "Generic (PLEG): container finished" podID="604402ee-2350-4cec-8a56-b5203c3287e8" containerID="9532c7b831e1593a4ce20c4ef7d36aa8ca50e6e5eec7b7417c942f408c559489" exitCode=1 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.942837 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderda2b-account-delete-kmsx2" event={"ID":"604402ee-2350-4cec-8a56-b5203c3287e8","Type":"ContainerDied","Data":"9532c7b831e1593a4ce20c4ef7d36aa8ca50e6e5eec7b7417c942f408c559489"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.950361 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.953281 4766 generic.go:334] "Generic (PLEG): container finished" podID="417726b2-75fd-4efc-84ec-803533df86aa" containerID="556527868dac8d175c6e11bca57d484310691f25ac252842802df8da12d3c8f3" exitCode=0 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.953329 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f669bd74-mz8xh" event={"ID":"417726b2-75fd-4efc-84ec-803533df86aa","Type":"ContainerDied","Data":"556527868dac8d175c6e11bca57d484310691f25ac252842802df8da12d3c8f3"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.953563 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.957842 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.960474 4766 generic.go:334] "Generic (PLEG): container finished" podID="1dfb6314-1f18-4e71-947e-534dc1021381" containerID="9ccd949b9d58181646f9d2d30d971f167213b68716379d53e92812bc4142132d" exitCode=0 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.960527 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1dfb6314-1f18-4e71-947e-534dc1021381","Type":"ContainerDied","Data":"9ccd949b9d58181646f9d2d30d971f167213b68716379d53e92812bc4142132d"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.965810 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicand6a3-account-delete-cjwdk" event={"ID":"8f5be126-5890-4cef-aa82-3bdeef1918cd","Type":"ContainerDied","Data":"72d8bba3608afcd65c56a78c71842042d1f3b5ec0095ba80470e00475ba35ae9"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.965854 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72d8bba3608afcd65c56a78c71842042d1f3b5ec0095ba80470e00475ba35ae9" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.965910 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicand6a3-account-delete-cjwdk" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.969079 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.977562 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.984193 4766 generic.go:334] "Generic (PLEG): container finished" podID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerID="7f14118b1949b7f4fa11ccf3e1c0978b97f42aeb4702c3b2333fce0c82a717cf" exitCode=0 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.984253 4766 generic.go:334] "Generic (PLEG): container finished" podID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerID="f4f799754c7d424d9e1a02edb786aa1d853e958a00bbe81227da9a2cf14ca4ee" exitCode=0 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.984266 4766 generic.go:334] "Generic (PLEG): container finished" podID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerID="d74114f82622e892b4a5e3b24618b3e3f03c2d8600dbbfb022a279d06521b94a" exitCode=0 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.984338 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerDied","Data":"7f14118b1949b7f4fa11ccf3e1c0978b97f42aeb4702c3b2333fce0c82a717cf"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.984415 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerDied","Data":"f4f799754c7d424d9e1a02edb786aa1d853e958a00bbe81227da9a2cf14ca4ee"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.984430 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerDied","Data":"d74114f82622e892b4a5e3b24618b3e3f03c2d8600dbbfb022a279d06521b94a"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.991887 4766 generic.go:334] "Generic (PLEG): container finished" podID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerID="b964f8ee7b5a62d95ec8c21e2f2cb9480262bdc47471ecb02c71bcf732440066" exitCode=1 Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.991982 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00dd1-account-delete-flgqj" event={"ID":"8b67744f-1b24-4baf-b397-26cff83c2a4d","Type":"ContainerDied","Data":"b964f8ee7b5a62d95ec8c21e2f2cb9480262bdc47471ecb02c71bcf732440066"} Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993164 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d1796c-9c3d-444c-bda3-2a7525ac2650-operator-scripts\") pod \"15d1796c-9c3d-444c-bda3-2a7525ac2650\" (UID: \"15d1796c-9c3d-444c-bda3-2a7525ac2650\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993249 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-httpd-run\") pod \"c6a00c8b-af47-4254-83de-a93a975b3afe\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993304 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c6a00c8b-af47-4254-83de-a93a975b3afe\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993328 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-logs\") pod 
\"c6a00c8b-af47-4254-83de-a93a975b3afe\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993357 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-scripts\") pod \"c6a00c8b-af47-4254-83de-a93a975b3afe\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993412 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell00dd1-account-delete-flgqj" secret="" err="secret \"galera-openstack-dockercfg-g8hnh\" not found" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993449 4766 scope.go:117] "RemoveContainer" containerID="b964f8ee7b5a62d95ec8c21e2f2cb9480262bdc47471ecb02c71bcf732440066" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993450 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl5c9\" (UniqueName: \"kubernetes.io/projected/f227ef39-ddef-411a-96b3-96871679cae1-kube-api-access-bl5c9\") pod \"f227ef39-ddef-411a-96b3-96871679cae1\" (UID: \"f227ef39-ddef-411a-96b3-96871679cae1\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993487 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f227ef39-ddef-411a-96b3-96871679cae1-operator-scripts\") pod \"f227ef39-ddef-411a-96b3-96871679cae1\" (UID: \"f227ef39-ddef-411a-96b3-96871679cae1\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993528 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-combined-ca-bundle\") pod \"c6a00c8b-af47-4254-83de-a93a975b3afe\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " Dec 09 03:38:17 crc 
kubenswrapper[4766]: I1209 03:38:17.993581 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwc7v\" (UniqueName: \"kubernetes.io/projected/c6a00c8b-af47-4254-83de-a93a975b3afe-kube-api-access-hwc7v\") pod \"c6a00c8b-af47-4254-83de-a93a975b3afe\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993608 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x67gk\" (UniqueName: \"kubernetes.io/projected/15d1796c-9c3d-444c-bda3-2a7525ac2650-kube-api-access-x67gk\") pod \"15d1796c-9c3d-444c-bda3-2a7525ac2650\" (UID: \"15d1796c-9c3d-444c-bda3-2a7525ac2650\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993635 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-config-data\") pod \"c6a00c8b-af47-4254-83de-a93a975b3afe\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993672 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-public-tls-certs\") pod \"c6a00c8b-af47-4254-83de-a93a975b3afe\" (UID: \"c6a00c8b-af47-4254-83de-a93a975b3afe\") " Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.993997 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15d1796c-9c3d-444c-bda3-2a7525ac2650-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15d1796c-9c3d-444c-bda3-2a7525ac2650" (UID: "15d1796c-9c3d-444c-bda3-2a7525ac2650"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:17 crc kubenswrapper[4766]: I1209 03:38:17.994207 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d1796c-9c3d-444c-bda3-2a7525ac2650-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.002198 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f227ef39-ddef-411a-96b3-96871679cae1-kube-api-access-bl5c9" (OuterVolumeSpecName: "kube-api-access-bl5c9") pod "f227ef39-ddef-411a-96b3-96871679cae1" (UID: "f227ef39-ddef-411a-96b3-96871679cae1"). InnerVolumeSpecName "kube-api-access-bl5c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.004958 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-logs" (OuterVolumeSpecName: "logs") pod "c6a00c8b-af47-4254-83de-a93a975b3afe" (UID: "c6a00c8b-af47-4254-83de-a93a975b3afe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.005205 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d1796c-9c3d-444c-bda3-2a7525ac2650-kube-api-access-x67gk" (OuterVolumeSpecName: "kube-api-access-x67gk") pod "15d1796c-9c3d-444c-bda3-2a7525ac2650" (UID: "15d1796c-9c3d-444c-bda3-2a7525ac2650"). InnerVolumeSpecName "kube-api-access-x67gk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.005820 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f227ef39-ddef-411a-96b3-96871679cae1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f227ef39-ddef-411a-96b3-96871679cae1" (UID: "f227ef39-ddef-411a-96b3-96871679cae1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.006170 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "c6a00c8b-af47-4254-83de-a93a975b3afe" (UID: "c6a00c8b-af47-4254-83de-a93a975b3afe"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.006500 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c6a00c8b-af47-4254-83de-a93a975b3afe" (UID: "c6a00c8b-af47-4254-83de-a93a975b3afe"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.014201 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-scripts" (OuterVolumeSpecName: "scripts") pod "c6a00c8b-af47-4254-83de-a93a975b3afe" (UID: "c6a00c8b-af47-4254-83de-a93a975b3afe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.015540 4766 generic.go:334] "Generic (PLEG): container finished" podID="77345185-b7d7-46d4-9e72-251bac080f3a" containerID="e72c786adf539230163f0d59c1d7e3330d6b3bfa65aeb081d7df9f3a4a6897e1" exitCode=1 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.015606 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibaee-account-delete-wvgpg" event={"ID":"77345185-b7d7-46d4-9e72-251bac080f3a","Type":"ContainerDied","Data":"e72c786adf539230163f0d59c1d7e3330d6b3bfa65aeb081d7df9f3a4a6897e1"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.017035 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.038305 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9edd6e7b-9841-43be-9478-5e7d06d8bd8d","Type":"ContainerDied","Data":"fbaa25ec7000e8f35a0d95d5eec00b43998c95f95c727becb44ac2ba43f72698"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.038366 4766 scope.go:117] "RemoveContainer" containerID="f77816b92b5912c6b1c7dc678c8c50b01d826733809db88019e22a9ee0865718" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.038526 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.048592 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.050442 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0abc-account-delete-9k2gp" event={"ID":"a764f29a-d427-43a5-833f-34b2064c122d","Type":"ContainerDied","Data":"f964c089d41b85fc1e3d8993a58705d0549a7ab94fd29c88c613be604a83b1ec"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.050555 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f964c089d41b85fc1e3d8993a58705d0549a7ab94fd29c88c613be604a83b1ec" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.050655 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement0abc-account-delete-9k2gp" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.051204 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a00c8b-af47-4254-83de-a93a975b3afe-kube-api-access-hwc7v" (OuterVolumeSpecName: "kube-api-access-hwc7v") pod "c6a00c8b-af47-4254-83de-a93a975b3afe" (UID: "c6a00c8b-af47-4254-83de-a93a975b3afe"). InnerVolumeSpecName "kube-api-access-hwc7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.056182 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6a00c8b-af47-4254-83de-a93a975b3afe" (UID: "c6a00c8b-af47-4254-83de-a93a975b3afe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.060588 4766 generic.go:334] "Generic (PLEG): container finished" podID="5edf46d6-e570-425b-843d-d67f5adde599" containerID="0e789a722d40821da0b1865e918a8aa00c5015bf08b60022e5a34cfa2cf5a712" exitCode=0 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.060718 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" event={"ID":"5edf46d6-e570-425b-843d-d67f5adde599","Type":"ContainerDied","Data":"0e789a722d40821da0b1865e918a8aa00c5015bf08b60022e5a34cfa2cf5a712"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.060810 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.072573 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.075386 4766 generic.go:334] "Generic (PLEG): container finished" podID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerID="d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40" exitCode=0 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.075650 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.076251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6a00c8b-af47-4254-83de-a93a975b3afe","Type":"ContainerDied","Data":"d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.076491 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c6a00c8b-af47-4254-83de-a93a975b3afe","Type":"ContainerDied","Data":"55470ae3ef794b26c2d7e26506f8171544701644da67acf668e94742dc50fa32"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.080243 4766 scope.go:117] "RemoveContainer" containerID="60568c7893411368ef7934b6d7f5d2360db7af5753dc4500e120b6309331f0b4" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.092376 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-config-data" (OuterVolumeSpecName: "config-data") pod "c6a00c8b-af47-4254-83de-a93a975b3afe" (UID: "c6a00c8b-af47-4254-83de-a93a975b3afe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.096758 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f5be126-5890-4cef-aa82-3bdeef1918cd-operator-scripts\") pod \"8f5be126-5890-4cef-aa82-3bdeef1918cd\" (UID: \"8f5be126-5890-4cef-aa82-3bdeef1918cd\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.097845 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47770faa-9973-4d81-a630-8c344bcd7b94-etc-machine-id\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.097985 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99d3047-bb16-4bbe-a77d-0f4199121e7d-logs\") pod \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098048 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098155 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-internal-tls-certs\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098310 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a764f29a-d427-43a5-833f-34b2064c122d-operator-scripts\") pod \"a764f29a-d427-43a5-833f-34b2064c122d\" (UID: \"a764f29a-d427-43a5-833f-34b2064c122d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098397 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhxfn\" (UniqueName: \"kubernetes.io/projected/47770faa-9973-4d81-a630-8c344bcd7b94-kube-api-access-mhxfn\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098474 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-combined-ca-bundle\") pod \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098533 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-combined-ca-bundle\") pod \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098591 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktgkw\" (UniqueName: \"kubernetes.io/projected/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-kube-api-access-ktgkw\") pod \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098671 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cbl8\" (UniqueName: \"kubernetes.io/projected/a764f29a-d427-43a5-833f-34b2064c122d-kube-api-access-2cbl8\") pod \"a764f29a-d427-43a5-833f-34b2064c122d\" (UID: \"a764f29a-d427-43a5-833f-34b2064c122d\") " 
Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098738 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-public-tls-certs\") pod \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098812 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnbsl\" (UniqueName: \"kubernetes.io/projected/c99d3047-bb16-4bbe-a77d-0f4199121e7d-kube-api-access-tnbsl\") pod \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098894 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-config-data\") pod \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.098966 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-combined-ca-bundle\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099221 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-logs\") pod \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099324 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-internal-tls-certs\") pod \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099394 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099472 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-scripts\") pod \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099561 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-scripts\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099625 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-internal-tls-certs\") pod \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099692 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-config-data\") pod \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\" (UID: \"c99d3047-bb16-4bbe-a77d-0f4199121e7d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099774 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47770faa-9973-4d81-a630-8c344bcd7b94-logs\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099839 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-public-tls-certs\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.099910 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-httpd-run\") pod \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\" (UID: \"9edd6e7b-9841-43be-9478-5e7d06d8bd8d\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.100140 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slrvb\" (UniqueName: \"kubernetes.io/projected/8f5be126-5890-4cef-aa82-3bdeef1918cd-kube-api-access-slrvb\") pod \"8f5be126-5890-4cef-aa82-3bdeef1918cd\" (UID: \"8f5be126-5890-4cef-aa82-3bdeef1918cd\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.100200 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data-custom\") pod \"47770faa-9973-4d81-a630-8c344bcd7b94\" (UID: \"47770faa-9973-4d81-a630-8c344bcd7b94\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.100676 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwc7v\" (UniqueName: \"kubernetes.io/projected/c6a00c8b-af47-4254-83de-a93a975b3afe-kube-api-access-hwc7v\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.100775 4766 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x67gk\" (UniqueName: \"kubernetes.io/projected/15d1796c-9c3d-444c-bda3-2a7525ac2650-kube-api-access-x67gk\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.100846 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.100898 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.100961 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.101011 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6a00c8b-af47-4254-83de-a93a975b3afe-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.101067 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.101124 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl5c9\" (UniqueName: \"kubernetes.io/projected/f227ef39-ddef-411a-96b3-96871679cae1-kube-api-access-bl5c9\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.101174 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f227ef39-ddef-411a-96b3-96871679cae1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.101239 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.105499 4766 generic.go:334] "Generic (PLEG): container finished" podID="01ada9c6-91af-4717-a157-29070bf61a6e" containerID="d07009fe163e1ff1f0efb68a9304c65721f93731af9b5db45462d77518e37f29" exitCode=0 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.105608 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"01ada9c6-91af-4717-a157-29070bf61a6e","Type":"ContainerDied","Data":"d07009fe163e1ff1f0efb68a9304c65721f93731af9b5db45462d77518e37f29"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.106804 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99d3047-bb16-4bbe-a77d-0f4199121e7d-logs" (OuterVolumeSpecName: "logs") pod "c99d3047-bb16-4bbe-a77d-0f4199121e7d" (UID: "c99d3047-bb16-4bbe-a77d-0f4199121e7d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.106872 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47770faa-9973-4d81-a630-8c344bcd7b94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.107284 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f5be126-5890-4cef-aa82-3bdeef1918cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f5be126-5890-4cef-aa82-3bdeef1918cd" (UID: "8f5be126-5890-4cef-aa82-3bdeef1918cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.107537 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9edd6e7b-9841-43be-9478-5e7d06d8bd8d" (UID: "9edd6e7b-9841-43be-9478-5e7d06d8bd8d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.109164 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a764f29a-d427-43a5-833f-34b2064c122d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a764f29a-d427-43a5-833f-34b2064c122d" (UID: "a764f29a-d427-43a5-833f-34b2064c122d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.109782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-logs" (OuterVolumeSpecName: "logs") pod "9edd6e7b-9841-43be-9478-5e7d06d8bd8d" (UID: "9edd6e7b-9841-43be-9478-5e7d06d8bd8d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.114037 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47770faa-9973-4d81-a630-8c344bcd7b94-logs" (OuterVolumeSpecName: "logs") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.139471 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a764f29a-d427-43a5-833f-34b2064c122d-kube-api-access-2cbl8" (OuterVolumeSpecName: "kube-api-access-2cbl8") pod "a764f29a-d427-43a5-833f-34b2064c122d" (UID: "a764f29a-d427-43a5-833f-34b2064c122d"). InnerVolumeSpecName "kube-api-access-2cbl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.140782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "9edd6e7b-9841-43be-9478-5e7d06d8bd8d" (UID: "9edd6e7b-9841-43be-9478-5e7d06d8bd8d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.141519 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-scripts" (OuterVolumeSpecName: "scripts") pod "9edd6e7b-9841-43be-9478-5e7d06d8bd8d" (UID: "9edd6e7b-9841-43be-9478-5e7d06d8bd8d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.159690 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-scripts" (OuterVolumeSpecName: "scripts") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.169475 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99d3047-bb16-4bbe-a77d-0f4199121e7d-kube-api-access-tnbsl" (OuterVolumeSpecName: "kube-api-access-tnbsl") pod "c99d3047-bb16-4bbe-a77d-0f4199121e7d" (UID: "c99d3047-bb16-4bbe-a77d-0f4199121e7d"). InnerVolumeSpecName "kube-api-access-tnbsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.169537 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5be126-5890-4cef-aa82-3bdeef1918cd-kube-api-access-slrvb" (OuterVolumeSpecName: "kube-api-access-slrvb") pod "8f5be126-5890-4cef-aa82-3bdeef1918cd" (UID: "8f5be126-5890-4cef-aa82-3bdeef1918cd"). InnerVolumeSpecName "kube-api-access-slrvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.173431 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.178478 4766 scope.go:117] "RemoveContainer" containerID="0e789a722d40821da0b1865e918a8aa00c5015bf08b60022e5a34cfa2cf5a712" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.178703 4766 generic.go:334] "Generic (PLEG): container finished" podID="ec41cb84-c47e-4199-ac5d-825bbf4f7023" containerID="d9c17096fe1163fbded2f9420441c595a2d9ae76cf7dab37fbf8a31e46afc79e" exitCode=0 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.178790 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ec41cb84-c47e-4199-ac5d-825bbf4f7023","Type":"ContainerDied","Data":"d9c17096fe1163fbded2f9420441c595a2d9ae76cf7dab37fbf8a31e46afc79e"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.209847 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47770faa-9973-4d81-a630-8c344bcd7b94-kube-api-access-mhxfn" (OuterVolumeSpecName: "kube-api-access-mhxfn") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "kube-api-access-mhxfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.217782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c6a00c8b-af47-4254-83de-a93a975b3afe" (UID: "c6a00c8b-af47-4254-83de-a93a975b3afe"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.219202 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-kube-api-access-ktgkw" (OuterVolumeSpecName: "kube-api-access-ktgkw") pod "9edd6e7b-9841-43be-9478-5e7d06d8bd8d" (UID: "9edd6e7b-9841-43be-9478-5e7d06d8bd8d"). InnerVolumeSpecName "kube-api-access-ktgkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.225508 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-config-data" (OuterVolumeSpecName: "config-data") pod "c99d3047-bb16-4bbe-a77d-0f4199121e7d" (UID: "c99d3047-bb16-4bbe-a77d-0f4199121e7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.226059 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246584 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-internal-tls-certs\") pod \"5edf46d6-e570-425b-843d-d67f5adde599\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246662 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzvkc\" (UniqueName: \"kubernetes.io/projected/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-api-access-bzvkc\") pod \"d1f2b76e-7443-46d3-a296-76196dcc28b7\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246734 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-config-data\") pod \"ece85ffc-6754-4f25-a66c-cf66043196b3\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246787 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-combined-ca-bundle\") pod \"d1f2b76e-7443-46d3-a296-76196dcc28b7\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246808 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-combined-ca-bundle\") pod \"ece85ffc-6754-4f25-a66c-cf66043196b3\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246832 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ece85ffc-6754-4f25-a66c-cf66043196b3-logs\") pod \"ece85ffc-6754-4f25-a66c-cf66043196b3\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246891 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-nova-metadata-tls-certs\") pod \"ece85ffc-6754-4f25-a66c-cf66043196b3\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246935 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data\") pod \"5edf46d6-e570-425b-843d-d67f5adde599\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246975 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-combined-ca-bundle\") pod \"5edf46d6-e570-425b-843d-d67f5adde599\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.246994 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-public-tls-certs\") pod \"5edf46d6-e570-425b-843d-d67f5adde599\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247037 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5edf46d6-e570-425b-843d-d67f5adde599-logs\") pod \"5edf46d6-e570-425b-843d-d67f5adde599\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 
03:38:18.247090 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data-custom\") pod \"5edf46d6-e570-425b-843d-d67f5adde599\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247123 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc7nh\" (UniqueName: \"kubernetes.io/projected/ece85ffc-6754-4f25-a66c-cf66043196b3-kube-api-access-kc7nh\") pod \"ece85ffc-6754-4f25-a66c-cf66043196b3\" (UID: \"ece85ffc-6754-4f25-a66c-cf66043196b3\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247159 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-config\") pod \"d1f2b76e-7443-46d3-a296-76196dcc28b7\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247227 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-certs\") pod \"d1f2b76e-7443-46d3-a296-76196dcc28b7\" (UID: \"d1f2b76e-7443-46d3-a296-76196dcc28b7\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247206 4766 generic.go:334] "Generic (PLEG): container finished" podID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerID="4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16" exitCode=0 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247251 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldc6d\" (UniqueName: \"kubernetes.io/projected/5edf46d6-e570-425b-843d-d67f5adde599-kube-api-access-ldc6d\") pod 
\"5edf46d6-e570-425b-843d-d67f5adde599\" (UID: \"5edf46d6-e570-425b-843d-d67f5adde599\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247359 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247848 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99d3047-bb16-4bbe-a77d-0f4199121e7d","Type":"ContainerDied","Data":"4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.247883 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99d3047-bb16-4bbe-a77d-0f4199121e7d","Type":"ContainerDied","Data":"93406039b39958ffc38ff9e1d5f4d92e52d9e962927f8aaf32a6f727c5db1c57"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.248129 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.248390 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.248412 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.248422 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47770faa-9973-4d81-a630-8c344bcd7b94-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.250716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5edf46d6-e570-425b-843d-d67f5adde599-logs" (OuterVolumeSpecName: "logs") pod "5edf46d6-e570-425b-843d-d67f5adde599" (UID: "5edf46d6-e570-425b-843d-d67f5adde599"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.253050 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ece85ffc-6754-4f25-a66c-cf66043196b3-logs" (OuterVolumeSpecName: "logs") pod "ece85ffc-6754-4f25-a66c-cf66043196b3" (UID: "ece85ffc-6754-4f25-a66c-cf66043196b3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.268263 4766 generic.go:334] "Generic (PLEG): container finished" podID="47770faa-9973-4d81-a630-8c344bcd7b94" containerID="03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970" exitCode=0 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.268427 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47770faa-9973-4d81-a630-8c344bcd7b94","Type":"ContainerDied","Data":"03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.268462 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"47770faa-9973-4d81-a630-8c344bcd7b94","Type":"ContainerDied","Data":"21f2a02a7d923e4ce0cc1cbd8be6525d1a84e7484dd0587a2325c4d72ba7ab68"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.268470 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.270165 4766 generic.go:334] "Generic (PLEG): container finished" podID="507e513c-d987-4fba-8fc0-e5ceff892afe" containerID="a067a547f7da427c71cff18393daf6fa745ddd3b809d72ffbb267a1fb541cd92" exitCode=0 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.270279 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"507e513c-d987-4fba-8fc0-e5ceff892afe","Type":"ContainerDied","Data":"a067a547f7da427c71cff18393daf6fa745ddd3b809d72ffbb267a1fb541cd92"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272170 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slrvb\" (UniqueName: \"kubernetes.io/projected/8f5be126-5890-4cef-aa82-3bdeef1918cd-kube-api-access-slrvb\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272368 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272388 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272402 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f5be126-5890-4cef-aa82-3bdeef1918cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272423 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47770faa-9973-4d81-a630-8c344bcd7b94-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc 
kubenswrapper[4766]: I1209 03:38:18.272437 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6a00c8b-af47-4254-83de-a93a975b3afe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272450 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99d3047-bb16-4bbe-a77d-0f4199121e7d-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272461 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a764f29a-d427-43a5-833f-34b2064c122d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272480 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhxfn\" (UniqueName: \"kubernetes.io/projected/47770faa-9973-4d81-a630-8c344bcd7b94-kube-api-access-mhxfn\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272511 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktgkw\" (UniqueName: \"kubernetes.io/projected/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-kube-api-access-ktgkw\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272524 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cbl8\" (UniqueName: \"kubernetes.io/projected/a764f29a-d427-43a5-833f-34b2064c122d-kube-api-access-2cbl8\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272536 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnbsl\" (UniqueName: \"kubernetes.io/projected/c99d3047-bb16-4bbe-a77d-0f4199121e7d-kube-api-access-tnbsl\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272547 4766 reconciler_common.go:293] "Volume 
detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272565 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272594 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.272613 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.276433 4766 generic.go:334] "Generic (PLEG): container finished" podID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerID="8f530fae2ed519e55e5f27f969322808cde1ee901eca5360c959594114369789" exitCode=0 Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.276484 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ece85ffc-6754-4f25-a66c-cf66043196b3","Type":"ContainerDied","Data":"8f530fae2ed519e55e5f27f969322808cde1ee901eca5360c959594114369789"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.276559 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.277197 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c99d3047-bb16-4bbe-a77d-0f4199121e7d" (UID: "c99d3047-bb16-4bbe-a77d-0f4199121e7d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.277303 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5edf46d6-e570-425b-843d-d67f5adde599-kube-api-access-ldc6d" (OuterVolumeSpecName: "kube-api-access-ldc6d") pod "5edf46d6-e570-425b-843d-d67f5adde599" (UID: "5edf46d6-e570-425b-843d-d67f5adde599"). InnerVolumeSpecName "kube-api-access-ldc6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.277499 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9edd6e7b-9841-43be-9478-5e7d06d8bd8d" (UID: "9edd6e7b-9841-43be-9478-5e7d06d8bd8d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.277639 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece85ffc-6754-4f25-a66c-cf66043196b3-kube-api-access-kc7nh" (OuterVolumeSpecName: "kube-api-access-kc7nh") pod "ece85ffc-6754-4f25-a66c-cf66043196b3" (UID: "ece85ffc-6754-4f25-a66c-cf66043196b3"). InnerVolumeSpecName "kube-api-access-kc7nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.278356 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-api-access-bzvkc" (OuterVolumeSpecName: "kube-api-access-bzvkc") pod "d1f2b76e-7443-46d3-a296-76196dcc28b7" (UID: "d1f2b76e-7443-46d3-a296-76196dcc28b7"). InnerVolumeSpecName "kube-api-access-bzvkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.282904 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance9417-account-delete-lg65s" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.284153 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronc2af-account-delete-zxgmn" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.285705 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronc2af-account-delete-zxgmn" event={"ID":"15d1796c-9c3d-444c-bda3-2a7525ac2650","Type":"ContainerDied","Data":"1518ae646abca79f08536d3239e5b6ffa1e4128f020d18e52bc568634df2d560"} Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.285750 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1518ae646abca79f08536d3239e5b6ffa1e4128f020d18e52bc568634df2d560" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.287425 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5edf46d6-e570-425b-843d-d67f5adde599" (UID: "5edf46d6-e570-425b-843d-d67f5adde599"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.302423 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.313567 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c99d3047-bb16-4bbe-a77d-0f4199121e7d" (UID: "c99d3047-bb16-4bbe-a77d-0f4199121e7d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.324183 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9edd6e7b-9841-43be-9478-5e7d06d8bd8d" (UID: "9edd6e7b-9841-43be-9478-5e7d06d8bd8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.328747 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.331870 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5edf46d6-e570-425b-843d-d67f5adde599" (UID: "5edf46d6-e570-425b-843d-d67f5adde599"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.335982 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ece85ffc-6754-4f25-a66c-cf66043196b3" (UID: "ece85ffc-6754-4f25-a66c-cf66043196b3"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.345363 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-config-data" (OuterVolumeSpecName: "config-data") pod "9edd6e7b-9841-43be-9478-5e7d06d8bd8d" (UID: "9edd6e7b-9841-43be-9478-5e7d06d8bd8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.363961 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "d1f2b76e-7443-46d3-a296-76196dcc28b7" (UID: "d1f2b76e-7443-46d3-a296-76196dcc28b7"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.363981 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1f2b76e-7443-46d3-a296-76196dcc28b7" (UID: "d1f2b76e-7443-46d3-a296-76196dcc28b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374501 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5edf46d6-e570-425b-843d-d67f5adde599-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374538 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374552 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374564 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc7nh\" (UniqueName: \"kubernetes.io/projected/ece85ffc-6754-4f25-a66c-cf66043196b3-kube-api-access-kc7nh\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374577 4766 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374588 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldc6d\" (UniqueName: \"kubernetes.io/projected/5edf46d6-e570-425b-843d-d67f5adde599-kube-api-access-ldc6d\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374602 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzvkc\" (UniqueName: \"kubernetes.io/projected/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-api-access-bzvkc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 
crc kubenswrapper[4766]: I1209 03:38:18.374613 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374626 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374638 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374651 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374661 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374671 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ece85ffc-6754-4f25-a66c-cf66043196b3-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374681 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374691 4766 reconciler_common.go:293] "Volume detached for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edd6e7b-9841-43be-9478-5e7d06d8bd8d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374702 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374712 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.374722 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.383909 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.383930 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c99d3047-bb16-4bbe-a77d-0f4199121e7d" (UID: "c99d3047-bb16-4bbe-a77d-0f4199121e7d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.384765 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data" (OuterVolumeSpecName: "config-data") pod "47770faa-9973-4d81-a630-8c344bcd7b94" (UID: "47770faa-9973-4d81-a630-8c344bcd7b94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.397547 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-config-data" (OuterVolumeSpecName: "config-data") pod "ece85ffc-6754-4f25-a66c-cf66043196b3" (UID: "ece85ffc-6754-4f25-a66c-cf66043196b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.406129 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5edf46d6-e570-425b-843d-d67f5adde599" (UID: "5edf46d6-e570-425b-843d-d67f5adde599"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.406162 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data" (OuterVolumeSpecName: "config-data") pod "5edf46d6-e570-425b-843d-d67f5adde599" (UID: "5edf46d6-e570-425b-843d-d67f5adde599"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.417179 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5edf46d6-e570-425b-843d-d67f5adde599" (UID: "5edf46d6-e570-425b-843d-d67f5adde599"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.422611 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "d1f2b76e-7443-46d3-a296-76196dcc28b7" (UID: "d1f2b76e-7443-46d3-a296-76196dcc28b7"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.440753 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ece85ffc-6754-4f25-a66c-cf66043196b3" (UID: "ece85ffc-6754-4f25-a66c-cf66043196b3"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.477293 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.477327 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.477338 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.477346 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47770faa-9973-4d81-a630-8c344bcd7b94-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.477356 4766 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f2b76e-7443-46d3-a296-76196dcc28b7-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.477366 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5edf46d6-e570-425b-843d-d67f5adde599-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.477374 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 
03:38:18.477381 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99d3047-bb16-4bbe-a77d-0f4199121e7d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.477391 4766 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ece85ffc-6754-4f25-a66c-cf66043196b3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.579342 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.579389 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.579432 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.579412 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts podName:604402ee-2350-4cec-8a56-b5203c3287e8 nodeName:}" failed. No retries permitted until 2025-12-09 03:38:20.579398228 +0000 UTC m=+1582.288703654 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts") pod "cinderda2b-account-delete-kmsx2" (UID: "604402ee-2350-4cec-8a56-b5203c3287e8") : configmap "openstack-scripts" not found Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.579511 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts podName:8b67744f-1b24-4baf-b397-26cff83c2a4d nodeName:}" failed. 
No retries permitted until 2025-12-09 03:38:20.579490181 +0000 UTC m=+1582.288795617 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts") pod "novacell00dd1-account-delete-flgqj" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d") : configmap "openstack-scripts" not found Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.579534 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts podName:77345185-b7d7-46d4-9e72-251bac080f3a nodeName:}" failed. No retries permitted until 2025-12-09 03:38:20.579521162 +0000 UTC m=+1582.288826608 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts") pod "novaapibaee-account-delete-wvgpg" (UID: "77345185-b7d7-46d4-9e72-251bac080f3a") : configmap "openstack-scripts" not found Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.629841 4766 scope.go:117] "RemoveContainer" containerID="7f712ce3fa09ea2b5ba75d7f96299ac239eca89fd085f13671911969bcb7a465" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.639030 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.680282 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-combined-ca-bundle\") pod \"1dfb6314-1f18-4e71-947e-534dc1021381\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.680320 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-config-data\") pod \"1dfb6314-1f18-4e71-947e-534dc1021381\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.680457 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z46qq\" (UniqueName: \"kubernetes.io/projected/1dfb6314-1f18-4e71-947e-534dc1021381-kube-api-access-z46qq\") pod \"1dfb6314-1f18-4e71-947e-534dc1021381\" (UID: \"1dfb6314-1f18-4e71-947e-534dc1021381\") " Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.680836 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.680878 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data podName:3af438c1-d0b9-4ecb-bb88-a0efd14736a4 nodeName:}" failed. No retries permitted until 2025-12-09 03:38:26.680865052 +0000 UTC m=+1588.390170478 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data") pod "rabbitmq-cell1-server-0" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4") : configmap "rabbitmq-cell1-config-data" not found Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.699205 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dfb6314-1f18-4e71-947e-534dc1021381-kube-api-access-z46qq" (OuterVolumeSpecName: "kube-api-access-z46qq") pod "1dfb6314-1f18-4e71-947e-534dc1021381" (UID: "1dfb6314-1f18-4e71-947e-534dc1021381"). InnerVolumeSpecName "kube-api-access-z46qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.701325 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.704038 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dfb6314-1f18-4e71-947e-534dc1021381" (UID: "1dfb6314-1f18-4e71-947e-534dc1021381"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.716980 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.724511 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.728766 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.731145 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-config-data" (OuterVolumeSpecName: "config-data") pod "1dfb6314-1f18-4e71-947e-534dc1021381" (UID: "1dfb6314-1f18-4e71-947e-534dc1021381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.736735 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.737330 4766 scope.go:117] "RemoveContainer" containerID="d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.742974 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.745925 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.752076 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronc2af-account-delete-zxgmn"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.759963 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutronc2af-account-delete-zxgmn"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.762748 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.771555 4766 scope.go:117] "RemoveContainer" containerID="4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.777980 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781343 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-config-data\") pod \"417726b2-75fd-4efc-84ec-803533df86aa\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781393 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-config-data\") pod \"507e513c-d987-4fba-8fc0-e5ceff892afe\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781424 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-scripts\") pod \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781455 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-internal-tls-certs\") pod \"417726b2-75fd-4efc-84ec-803533df86aa\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781489 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-combined-ca-bundle\") pod \"417726b2-75fd-4efc-84ec-803533df86aa\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781530 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-run-httpd\") pod \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781555 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-log-httpd\") pod \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781580 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-public-tls-certs\") pod \"417726b2-75fd-4efc-84ec-803533df86aa\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781604 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-ceilometer-tls-certs\") pod \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781631 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-combined-ca-bundle\") pod \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " Dec 09 03:38:18 crc kubenswrapper[4766]: 
I1209 03:38:18.781650 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-sg-core-conf-yaml\") pod \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.781666 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68nf2\" (UniqueName: \"kubernetes.io/projected/54e4f191-0150-4bdb-9afa-2cc5164c6b55-kube-api-access-68nf2\") pod \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.782524 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-config-data\") pod \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\" (UID: \"54e4f191-0150-4bdb-9afa-2cc5164c6b55\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.782598 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jnx7\" (UniqueName: \"kubernetes.io/projected/507e513c-d987-4fba-8fc0-e5ceff892afe-kube-api-access-4jnx7\") pod \"507e513c-d987-4fba-8fc0-e5ceff892afe\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.782696 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "54e4f191-0150-4bdb-9afa-2cc5164c6b55" (UID: "54e4f191-0150-4bdb-9afa-2cc5164c6b55"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.783518 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "54e4f191-0150-4bdb-9afa-2cc5164c6b55" (UID: "54e4f191-0150-4bdb-9afa-2cc5164c6b55"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.783826 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-combined-ca-bundle\") pod \"507e513c-d987-4fba-8fc0-e5ceff892afe\" (UID: \"507e513c-d987-4fba-8fc0-e5ceff892afe\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.783876 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/417726b2-75fd-4efc-84ec-803533df86aa-logs\") pod \"417726b2-75fd-4efc-84ec-803533df86aa\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.783903 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-scripts\") pod \"417726b2-75fd-4efc-84ec-803533df86aa\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.783949 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvwz7\" (UniqueName: \"kubernetes.io/projected/417726b2-75fd-4efc-84ec-803533df86aa-kube-api-access-fvwz7\") pod \"417726b2-75fd-4efc-84ec-803533df86aa\" (UID: \"417726b2-75fd-4efc-84ec-803533df86aa\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.785499 4766 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.785529 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dfb6314-1f18-4e71-947e-534dc1021381-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.785541 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z46qq\" (UniqueName: \"kubernetes.io/projected/1dfb6314-1f18-4e71-947e-534dc1021381-kube-api-access-z46qq\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.785555 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.785567 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54e4f191-0150-4bdb-9afa-2cc5164c6b55-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.793139 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417726b2-75fd-4efc-84ec-803533df86aa-kube-api-access-fvwz7" (OuterVolumeSpecName: "kube-api-access-fvwz7") pod "417726b2-75fd-4efc-84ec-803533df86aa" (UID: "417726b2-75fd-4efc-84ec-803533df86aa"). InnerVolumeSpecName "kube-api-access-fvwz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.796469 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507e513c-d987-4fba-8fc0-e5ceff892afe-kube-api-access-4jnx7" (OuterVolumeSpecName: "kube-api-access-4jnx7") pod "507e513c-d987-4fba-8fc0-e5ceff892afe" (UID: "507e513c-d987-4fba-8fc0-e5ceff892afe"). InnerVolumeSpecName "kube-api-access-4jnx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.804999 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.815718 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e4f191-0150-4bdb-9afa-2cc5164c6b55-kube-api-access-68nf2" (OuterVolumeSpecName: "kube-api-access-68nf2") pod "54e4f191-0150-4bdb-9afa-2cc5164c6b55" (UID: "54e4f191-0150-4bdb-9afa-2cc5164c6b55"). InnerVolumeSpecName "kube-api-access-68nf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.815792 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-scripts" (OuterVolumeSpecName: "scripts") pod "54e4f191-0150-4bdb-9afa-2cc5164c6b55" (UID: "54e4f191-0150-4bdb-9afa-2cc5164c6b55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.820058 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/417726b2-75fd-4efc-84ec-803533df86aa-logs" (OuterVolumeSpecName: "logs") pod "417726b2-75fd-4efc-84ec-803533df86aa" (UID: "417726b2-75fd-4efc-84ec-803533df86aa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.820455 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.833445 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-config-data" (OuterVolumeSpecName: "config-data") pod "507e513c-d987-4fba-8fc0-e5ceff892afe" (UID: "507e513c-d987-4fba-8fc0-e5ceff892afe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.835024 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.843799 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.847847 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-scripts" (OuterVolumeSpecName: "scripts") pod "417726b2-75fd-4efc-84ec-803533df86aa" (UID: "417726b2-75fd-4efc-84ec-803533df86aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.851129 4766 scope.go:117] "RemoveContainer" containerID="d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40" Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.869955 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40\": container with ID starting with d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40 not found: ID does not exist" containerID="d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.869991 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40"} err="failed to get container status \"d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40\": rpc error: code = NotFound desc = could not find container \"d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40\": container with ID starting with d432c923bb6e5d026d188f99535cf05c6edc04bf3115025e1b13ce0d868c8e40 not found: ID does not exist" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.870012 4766 scope.go:117] "RemoveContainer" containerID="4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271" Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.889254 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271\": container with ID starting with 4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271 not found: ID does not exist" containerID="4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.889302 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271"} err="failed to get container status \"4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271\": rpc error: code = NotFound desc = could not find container \"4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271\": container with ID starting with 4607a483883c86340cfbac97828e166c3da4857e38370adb0540caa38f0eb271 not found: ID does not exist" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.889329 4766 scope.go:117] "RemoveContainer" containerID="4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.889607 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d1796c-9c3d-444c-bda3-2a7525ac2650" path="/var/lib/kubelet/pods/15d1796c-9c3d-444c-bda3-2a7525ac2650/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.889980 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frvrg\" (UniqueName: \"kubernetes.io/projected/604402ee-2350-4cec-8a56-b5203c3287e8-kube-api-access-frvrg\") pod \"604402ee-2350-4cec-8a56-b5203c3287e8\" (UID: \"604402ee-2350-4cec-8a56-b5203c3287e8\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890179 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gntm\" (UniqueName: \"kubernetes.io/projected/01ada9c6-91af-4717-a157-29070bf61a6e-kube-api-access-4gntm\") pod \"01ada9c6-91af-4717-a157-29070bf61a6e\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890230 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-config-data\") pod \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\" (UID: 
\"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890251 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-memcached-tls-certs\") pod \"01ada9c6-91af-4717-a157-29070bf61a6e\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890270 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-combined-ca-bundle\") pod \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890303 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts\") pod \"604402ee-2350-4cec-8a56-b5203c3287e8\" (UID: \"604402ee-2350-4cec-8a56-b5203c3287e8\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890328 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-combined-ca-bundle\") pod \"01ada9c6-91af-4717-a157-29070bf61a6e\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890358 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rnzx\" (UniqueName: \"kubernetes.io/projected/77345185-b7d7-46d4-9e72-251bac080f3a-kube-api-access-8rnzx\") pod \"77345185-b7d7-46d4-9e72-251bac080f3a\" (UID: \"77345185-b7d7-46d4-9e72-251bac080f3a\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890380 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mxbhp\" (UniqueName: \"kubernetes.io/projected/ec41cb84-c47e-4199-ac5d-825bbf4f7023-kube-api-access-mxbhp\") pod \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\" (UID: \"ec41cb84-c47e-4199-ac5d-825bbf4f7023\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890404 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-config-data\") pod \"01ada9c6-91af-4717-a157-29070bf61a6e\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890423 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts\") pod \"77345185-b7d7-46d4-9e72-251bac080f3a\" (UID: \"77345185-b7d7-46d4-9e72-251bac080f3a\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890438 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-kolla-config\") pod \"01ada9c6-91af-4717-a157-29070bf61a6e\" (UID: \"01ada9c6-91af-4717-a157-29070bf61a6e\") " Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890492 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32828f3e-2f89-41d4-abcd-d4a433a53db1" path="/var/lib/kubelet/pods/32828f3e-2f89-41d4-abcd-d4a433a53db1/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890720 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890734 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68nf2\" (UniqueName: 
\"kubernetes.io/projected/54e4f191-0150-4bdb-9afa-2cc5164c6b55-kube-api-access-68nf2\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890745 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jnx7\" (UniqueName: \"kubernetes.io/projected/507e513c-d987-4fba-8fc0-e5ceff892afe-kube-api-access-4jnx7\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890756 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/417726b2-75fd-4efc-84ec-803533df86aa-logs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890764 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890774 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvwz7\" (UniqueName: \"kubernetes.io/projected/417726b2-75fd-4efc-84ec-803533df86aa-kube-api-access-fvwz7\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.890782 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.891735 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354d9984-d7b5-4540-a96e-a68a7bf1b667" path="/var/lib/kubelet/pods/354d9984-d7b5-4540-a96e-a68a7bf1b667/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.892615 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "604402ee-2350-4cec-8a56-b5203c3287e8" 
(UID: "604402ee-2350-4cec-8a56-b5203c3287e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.894859 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77345185-b7d7-46d4-9e72-251bac080f3a" (UID: "77345185-b7d7-46d4-9e72-251bac080f3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.895318 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "01ada9c6-91af-4717-a157-29070bf61a6e" (UID: "01ada9c6-91af-4717-a157-29070bf61a6e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.895794 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-config-data" (OuterVolumeSpecName: "config-data") pod "01ada9c6-91af-4717-a157-29070bf61a6e" (UID: "01ada9c6-91af-4717-a157-29070bf61a6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.897412 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" path="/var/lib/kubelet/pods/7ae0a18c-f118-45b5-8989-9ca3a49827ad/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.907168 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" path="/var/lib/kubelet/pods/a57927d7-7099-4b87-99ee-77aa589cd09f/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.909644 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" path="/var/lib/kubelet/pods/c6a00c8b-af47-4254-83de-a93a975b3afe/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.911255 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" path="/var/lib/kubelet/pods/c99d3047-bb16-4bbe-a77d-0f4199121e7d/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.912632 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc2d2d4c-93e7-437e-854f-e768a62c04ee" path="/var/lib/kubelet/pods/dc2d2d4c-93e7-437e-854f-e768a62c04ee/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.913343 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" path="/var/lib/kubelet/pods/ece85ffc-6754-4f25-a66c-cf66043196b3/volumes" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.917964 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604402ee-2350-4cec-8a56-b5203c3287e8-kube-api-access-frvrg" (OuterVolumeSpecName: "kube-api-access-frvrg") pod "604402ee-2350-4cec-8a56-b5203c3287e8" (UID: "604402ee-2350-4cec-8a56-b5203c3287e8"). InnerVolumeSpecName "kube-api-access-frvrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.941273 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77345185-b7d7-46d4-9e72-251bac080f3a-kube-api-access-8rnzx" (OuterVolumeSpecName: "kube-api-access-8rnzx") pod "77345185-b7d7-46d4-9e72-251bac080f3a" (UID: "77345185-b7d7-46d4-9e72-251bac080f3a"). InnerVolumeSpecName "kube-api-access-8rnzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.941185 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ada9c6-91af-4717-a157-29070bf61a6e-kube-api-access-4gntm" (OuterVolumeSpecName: "kube-api-access-4gntm") pod "01ada9c6-91af-4717-a157-29070bf61a6e" (UID: "01ada9c6-91af-4717-a157-29070bf61a6e"). InnerVolumeSpecName "kube-api-access-4gntm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.946044 4766 scope.go:117] "RemoveContainer" containerID="73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.946268 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec41cb84-c47e-4199-ac5d-825bbf4f7023-kube-api-access-mxbhp" (OuterVolumeSpecName: "kube-api-access-mxbhp") pod "ec41cb84-c47e-4199-ac5d-825bbf4f7023" (UID: "ec41cb84-c47e-4199-ac5d-825bbf4f7023"). InnerVolumeSpecName "kube-api-access-mxbhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.954119 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "417726b2-75fd-4efc-84ec-803533df86aa" (UID: "417726b2-75fd-4efc-84ec-803533df86aa"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.968150 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.968188 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.968206 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.968234 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.968244 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b9bc9ddf8-vdj7m"] Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.973686 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-config-data" (OuterVolumeSpecName: "config-data") pod "ec41cb84-c47e-4199-ac5d-825bbf4f7023" (UID: "ec41cb84-c47e-4199-ac5d-825bbf4f7023"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.974779 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "507e513c-d987-4fba-8fc0-e5ceff892afe" (UID: "507e513c-d987-4fba-8fc0-e5ceff892afe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.990103 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.994945 4766 scope.go:117] "RemoveContainer" containerID="4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16" Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.995531 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16\": container with ID starting with 4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16 not found: ID does not exist" containerID="4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.997258 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16"} err="failed to get container status \"4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16\": rpc error: code = NotFound desc = could not find container \"4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16\": container with ID starting with 4e9cec2eb1461900d25457444214c4d45464dfe4e1c7dae66c91455d27d90c16 not found: ID does not exist" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.997395 4766 scope.go:117] "RemoveContainer" containerID="73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.996313 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/604402ee-2350-4cec-8a56-b5203c3287e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.997658 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/507e513c-d987-4fba-8fc0-e5ceff892afe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.997729 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rnzx\" (UniqueName: \"kubernetes.io/projected/77345185-b7d7-46d4-9e72-251bac080f3a-kube-api-access-8rnzx\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.997926 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxbhp\" (UniqueName: \"kubernetes.io/projected/ec41cb84-c47e-4199-ac5d-825bbf4f7023-kube-api-access-mxbhp\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.997993 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.998059 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77345185-b7d7-46d4-9e72-251bac080f3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.998130 4766 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01ada9c6-91af-4717-a157-29070bf61a6e-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.998199 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frvrg\" (UniqueName: \"kubernetes.io/projected/604402ee-2350-4cec-8a56-b5203c3287e8-kube-api-access-frvrg\") 
on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.998332 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.998426 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gntm\" (UniqueName: \"kubernetes.io/projected/01ada9c6-91af-4717-a157-29070bf61a6e-kube-api-access-4gntm\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.998504 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.995920 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "54e4f191-0150-4bdb-9afa-2cc5164c6b55" (UID: "54e4f191-0150-4bdb-9afa-2cc5164c6b55"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: E1209 03:38:18.999069 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23\": container with ID starting with 73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23 not found: ID does not exist" containerID="73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.999080 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "54e4f191-0150-4bdb-9afa-2cc5164c6b55" (UID: "54e4f191-0150-4bdb-9afa-2cc5164c6b55"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.999100 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23"} err="failed to get container status \"73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23\": rpc error: code = NotFound desc = could not find container \"73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23\": container with ID starting with 73061c67def10906ba85f4983b32303a4696ea6b0964eebbf5357f2a15441b23 not found: ID does not exist" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.999124 4766 scope.go:117] "RemoveContainer" containerID="03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970" Dec 09 03:38:18 crc kubenswrapper[4766]: I1209 03:38:18.999128 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod 
"417726b2-75fd-4efc-84ec-803533df86aa" (UID: "417726b2-75fd-4efc-84ec-803533df86aa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.001808 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b9bc9ddf8-vdj7m"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.005858 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01ada9c6-91af-4717-a157-29070bf61a6e" (UID: "01ada9c6-91af-4717-a157-29070bf61a6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.007366 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "417726b2-75fd-4efc-84ec-803533df86aa" (UID: "417726b2-75fd-4efc-84ec-803533df86aa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.015353 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec41cb84-c47e-4199-ac5d-825bbf4f7023" (UID: "ec41cb84-c47e-4199-ac5d-825bbf4f7023"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.015717 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.024131 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "01ada9c6-91af-4717-a157-29070bf61a6e" (UID: "01ada9c6-91af-4717-a157-29070bf61a6e"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.039578 4766 scope.go:117] "RemoveContainer" containerID="fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.057896 4766 scope.go:117] "RemoveContainer" containerID="03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970" Dec 09 03:38:19 crc kubenswrapper[4766]: E1209 03:38:19.058505 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970\": container with ID starting with 03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970 not found: ID does not exist" containerID="03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.058579 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970"} err="failed to get container status \"03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970\": rpc error: code = 
NotFound desc = could not find container \"03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970\": container with ID starting with 03084366c8a20c586a945bb9530204e73533e4690d9f1dd708879f952a6ec970 not found: ID does not exist" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.058617 4766 scope.go:117] "RemoveContainer" containerID="fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67" Dec 09 03:38:19 crc kubenswrapper[4766]: E1209 03:38:19.058937 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67\": container with ID starting with fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67 not found: ID does not exist" containerID="fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.058985 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67"} err="failed to get container status \"fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67\": rpc error: code = NotFound desc = could not find container \"fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67\": container with ID starting with fd65daf6097cba30789fab9ce91ab0968d8508608dd8cb9d8d430336d425be67 not found: ID does not exist" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.059179 4766 scope.go:117] "RemoveContainer" containerID="8f530fae2ed519e55e5f27f969322808cde1ee901eca5360c959594114369789" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.077449 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-config-data" (OuterVolumeSpecName: "config-data") pod "417726b2-75fd-4efc-84ec-803533df86aa" (UID: "417726b2-75fd-4efc-84ec-803533df86aa"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.080453 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-config-data" (OuterVolumeSpecName: "config-data") pod "54e4f191-0150-4bdb-9afa-2cc5164c6b55" (UID: "54e4f191-0150-4bdb-9afa-2cc5164c6b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.080823 4766 scope.go:117] "RemoveContainer" containerID="33f5f8cb257a0368fe88133631597afd0d390af9af8575a7cd312d29bca8a767" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.086927 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54e4f191-0150-4bdb-9afa-2cc5164c6b55" (UID: "54e4f191-0150-4bdb-9afa-2cc5164c6b55"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099498 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099517 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099526 4766 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099534 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099542 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099550 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e4f191-0150-4bdb-9afa-2cc5164c6b55-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099558 4766 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 
03:38:19.099566 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec41cb84-c47e-4199-ac5d-825bbf4f7023-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099574 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ada9c6-91af-4717-a157-29070bf61a6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.099581 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/417726b2-75fd-4efc-84ec-803533df86aa-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.301620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"01ada9c6-91af-4717-a157-29070bf61a6e","Type":"ContainerDied","Data":"6bf1c9777693a010170643b0ca7f092e72290fe3f862ff533b489ecae3969912"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.301664 4766 scope.go:117] "RemoveContainer" containerID="d07009fe163e1ff1f0efb68a9304c65721f93731af9b5db45462d77518e37f29" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.301764 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.314813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9f669bd74-mz8xh" event={"ID":"417726b2-75fd-4efc-84ec-803533df86aa","Type":"ContainerDied","Data":"d991dad51a32ef6a3113dae9632673595f4c0d7407f9c9ca225f6f7068803079"} Dec 09 03:38:19 crc kubenswrapper[4766]: E1209 03:38:19.314942 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466 is running failed: container process not found" containerID="0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.314879 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9f669bd74-mz8xh" Dec 09 03:38:19 crc kubenswrapper[4766]: E1209 03:38:19.315293 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466 is running failed: container process not found" containerID="0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 09 03:38:19 crc kubenswrapper[4766]: E1209 03:38:19.315530 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466 is running failed: container process not found" containerID="0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 
09 03:38:19 crc kubenswrapper[4766]: E1209 03:38:19.315557 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" containerName="galera" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.317582 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibaee-account-delete-wvgpg" event={"ID":"77345185-b7d7-46d4-9e72-251bac080f3a","Type":"ContainerDied","Data":"d66e936afde637eec05db5d0f8188e162f3620b11a8361f5764713663ae26831"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.317668 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibaee-account-delete-wvgpg" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.320353 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ec41cb84-c47e-4199-ac5d-825bbf4f7023","Type":"ContainerDied","Data":"1c3761e21b8f09276f63e9f5a8e2c9438d29c4a9e0f21e68be9f120fabb3f02e"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.320443 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.335725 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.336293 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"507e513c-d987-4fba-8fc0-e5ceff892afe","Type":"ContainerDied","Data":"4c403ef6fd2f9ac5cfbdba2eb7dc37395e124f01c1e4be74d23567bd872ea6e2"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.341400 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderda2b-account-delete-kmsx2" event={"ID":"604402ee-2350-4cec-8a56-b5203c3287e8","Type":"ContainerDied","Data":"1abee6058ecbf6347063a4c2be175b5b3754936923625fc26ba3b16a8bb825e0"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.341466 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderda2b-account-delete-kmsx2" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.347819 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00dd1-account-delete-flgqj" event={"ID":"8b67744f-1b24-4baf-b397-26cff83c2a4d","Type":"ContainerStarted","Data":"0676e56c92cfee8a1d0d596375582bc5791b8e314503bd6233f9da366126d99f"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.348393 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell00dd1-account-delete-flgqj" secret="" err="secret \"galera-openstack-dockercfg-g8hnh\" not found" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.356687 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.358558 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1dfb6314-1f18-4e71-947e-534dc1021381","Type":"ContainerDied","Data":"6027d6b4989aa9f0c12e49a78aff088e0e757cd4f6e7f56393c2ea903c89e979"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.368959 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.371367 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54e4f191-0150-4bdb-9afa-2cc5164c6b55","Type":"ContainerDied","Data":"6dc105c8b5ca5bf8bcc72f794af15cdec8e465fa04414c427ef2eae773c478cc"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.375317 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_45ff249a-854d-4c30-8216-b7bd9482e08c/ovn-northd/0.log" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.375458 4766 generic.go:334] "Generic (PLEG): container finished" podID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerID="24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" exitCode=139 Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.375552 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"45ff249a-854d-4c30-8216-b7bd9482e08c","Type":"ContainerDied","Data":"24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.376880 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d1f2b76e-7443-46d3-a296-76196dcc28b7","Type":"ContainerDied","Data":"be40d3f66ebf0fe4dd930179531e52cbdd602c3bb334f395b760f59c8ba17487"} Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.377021 4766 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.442634 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_45ff249a-854d-4c30-8216-b7bd9482e08c/ovn-northd/0.log" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.442705 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.483990 4766 scope.go:117] "RemoveContainer" containerID="556527868dac8d175c6e11bca57d484310691f25ac252842802df8da12d3c8f3" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.489235 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibaee-account-delete-wvgpg"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511037 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-config\") pod \"45ff249a-854d-4c30-8216-b7bd9482e08c\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511146 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-rundir\") pod \"45ff249a-854d-4c30-8216-b7bd9482e08c\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511182 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94jzf\" (UniqueName: \"kubernetes.io/projected/45ff249a-854d-4c30-8216-b7bd9482e08c-kube-api-access-94jzf\") pod \"45ff249a-854d-4c30-8216-b7bd9482e08c\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511201 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-scripts\") pod \"45ff249a-854d-4c30-8216-b7bd9482e08c\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511245 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-northd-tls-certs\") pod \"45ff249a-854d-4c30-8216-b7bd9482e08c\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511283 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-combined-ca-bundle\") pod \"45ff249a-854d-4c30-8216-b7bd9482e08c\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511307 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-metrics-certs-tls-certs\") pod \"45ff249a-854d-4c30-8216-b7bd9482e08c\" (UID: \"45ff249a-854d-4c30-8216-b7bd9482e08c\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511580 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-config" (OuterVolumeSpecName: "config") pod "45ff249a-854d-4c30-8216-b7bd9482e08c" (UID: "45ff249a-854d-4c30-8216-b7bd9482e08c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.511789 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.512140 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "45ff249a-854d-4c30-8216-b7bd9482e08c" (UID: "45ff249a-854d-4c30-8216-b7bd9482e08c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.513271 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-scripts" (OuterVolumeSpecName: "scripts") pod "45ff249a-854d-4c30-8216-b7bd9482e08c" (UID: "45ff249a-854d-4c30-8216-b7bd9482e08c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.513315 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapibaee-account-delete-wvgpg"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.523233 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ff249a-854d-4c30-8216-b7bd9482e08c-kube-api-access-94jzf" (OuterVolumeSpecName: "kube-api-access-94jzf") pod "45ff249a-854d-4c30-8216-b7bd9482e08c" (UID: "45ff249a-854d-4c30-8216-b7bd9482e08c"). InnerVolumeSpecName "kube-api-access-94jzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.546802 4766 scope.go:117] "RemoveContainer" containerID="c3015d64e08f82a88af0129948492bc3c6ad857b57c9483bd887eeb673aa3dc3" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.551765 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.590985 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "45ff249a-854d-4c30-8216-b7bd9482e08c" (UID: "45ff249a-854d-4c30-8216-b7bd9482e08c"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.593084 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45ff249a-854d-4c30-8216-b7bd9482e08c" (UID: "45ff249a-854d-4c30-8216-b7bd9482e08c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.595946 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.608458 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.613434 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.613456 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94jzf\" (UniqueName: \"kubernetes.io/projected/45ff249a-854d-4c30-8216-b7bd9482e08c-kube-api-access-94jzf\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.613467 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ff249a-854d-4c30-8216-b7bd9482e08c-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.613475 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.613483 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.615834 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.632647 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.639884 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "45ff249a-854d-4c30-8216-b7bd9482e08c" (UID: "45ff249a-854d-4c30-8216-b7bd9482e08c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.654350 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.658935 4766 scope.go:117] "RemoveContainer" containerID="e72c786adf539230163f0d59c1d7e3330d6b3bfa65aeb081d7df9f3a4a6897e1" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.660580 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderda2b-account-delete-kmsx2"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.667368 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinderda2b-account-delete-kmsx2"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.677151 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.685827 4766 scope.go:117] "RemoveContainer" containerID="d9c17096fe1163fbded2f9420441c595a2d9ae76cf7dab37fbf8a31e46afc79e" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.685957 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.695001 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9f669bd74-mz8xh"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.710116 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9f669bd74-mz8xh"] Dec 09 03:38:19 crc 
kubenswrapper[4766]: I1209 03:38:19.713866 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.714925 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ff249a-854d-4c30-8216-b7bd9482e08c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: E1209 03:38:19.714959 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 09 03:38:19 crc kubenswrapper[4766]: E1209 03:38:19.715018 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data podName:48862672-08e2-4ac6-86a3-57d84bbc868d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:27.715000569 +0000 UTC m=+1589.424305995 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data") pod "rabbitmq-server-0" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d") : configmap "rabbitmq-config-data" not found Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.735888 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.745494 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.754897 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.755335 4766 scope.go:117] "RemoveContainer" containerID="a067a547f7da427c71cff18393daf6fa745ddd3b809d72ffbb267a1fb541cd92" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.810866 4766 scope.go:117] "RemoveContainer" 
containerID="9532c7b831e1593a4ce20c4ef7d36aa8ca50e6e5eec7b7417c942f408c559489" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.812652 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.841996 4766 scope.go:117] "RemoveContainer" containerID="9ccd949b9d58181646f9d2d30d971f167213b68716379d53e92812bc4142132d" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.875877 4766 scope.go:117] "RemoveContainer" containerID="7f14118b1949b7f4fa11ccf3e1c0978b97f42aeb4702c3b2333fce0c82a717cf" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.901790 4766 scope.go:117] "RemoveContainer" containerID="6d71bb4e4b5fbaf252c35fbed7673f51dd06fc7cc2b55da69c2bf9d00916cfed" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917102 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-default\") pod \"26d1d344-fbf5-415d-952e-9ee50493a134\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917220 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bktq8\" (UniqueName: \"kubernetes.io/projected/26d1d344-fbf5-415d-952e-9ee50493a134-kube-api-access-bktq8\") pod \"26d1d344-fbf5-415d-952e-9ee50493a134\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917262 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-galera-tls-certs\") pod \"26d1d344-fbf5-415d-952e-9ee50493a134\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917293 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"26d1d344-fbf5-415d-952e-9ee50493a134\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917338 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-combined-ca-bundle\") pod \"26d1d344-fbf5-415d-952e-9ee50493a134\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917416 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-kolla-config\") pod \"26d1d344-fbf5-415d-952e-9ee50493a134\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917458 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-generated\") pod \"26d1d344-fbf5-415d-952e-9ee50493a134\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917508 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-operator-scripts\") pod \"26d1d344-fbf5-415d-952e-9ee50493a134\" (UID: \"26d1d344-fbf5-415d-952e-9ee50493a134\") " Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917627 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "26d1d344-fbf5-415d-952e-9ee50493a134" (UID: 
"26d1d344-fbf5-415d-952e-9ee50493a134"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.917958 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.918729 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "26d1d344-fbf5-415d-952e-9ee50493a134" (UID: "26d1d344-fbf5-415d-952e-9ee50493a134"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.919308 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "26d1d344-fbf5-415d-952e-9ee50493a134" (UID: "26d1d344-fbf5-415d-952e-9ee50493a134"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.920163 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26d1d344-fbf5-415d-952e-9ee50493a134" (UID: "26d1d344-fbf5-415d-952e-9ee50493a134"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.921169 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d1d344-fbf5-415d-952e-9ee50493a134-kube-api-access-bktq8" (OuterVolumeSpecName: "kube-api-access-bktq8") pod "26d1d344-fbf5-415d-952e-9ee50493a134" (UID: "26d1d344-fbf5-415d-952e-9ee50493a134"). InnerVolumeSpecName "kube-api-access-bktq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.922354 4766 scope.go:117] "RemoveContainer" containerID="f4f799754c7d424d9e1a02edb786aa1d853e958a00bbe81227da9a2cf14ca4ee" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.926537 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "26d1d344-fbf5-415d-952e-9ee50493a134" (UID: "26d1d344-fbf5-415d-952e-9ee50493a134"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.945804 4766 scope.go:117] "RemoveContainer" containerID="d74114f82622e892b4a5e3b24618b3e3f03c2d8600dbbfb022a279d06521b94a" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.947408 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26d1d344-fbf5-415d-952e-9ee50493a134" (UID: "26d1d344-fbf5-415d-952e-9ee50493a134"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.970313 4766 scope.go:117] "RemoveContainer" containerID="521a97f36468751bc00689dc257329f3eef8640694009d32dbf2669dae3f1809" Dec 09 03:38:19 crc kubenswrapper[4766]: I1209 03:38:19.995670 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "26d1d344-fbf5-415d-952e-9ee50493a134" (UID: "26d1d344-fbf5-415d-952e-9ee50493a134"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.019181 4766 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.019237 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.019248 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d1d344-fbf5-415d-952e-9ee50493a134-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.019256 4766 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.019266 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/26d1d344-fbf5-415d-952e-9ee50493a134-config-data-generated\") on node \"crc\" 
DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.019275 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26d1d344-fbf5-415d-952e-9ee50493a134-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.019283 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bktq8\" (UniqueName: \"kubernetes.io/projected/26d1d344-fbf5-415d-952e-9ee50493a134-kube-api-access-bktq8\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.034666 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.121281 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.247237 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.323776 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-pod-info\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.323862 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-erlang-cookie\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.323887 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gn76\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-kube-api-access-9gn76\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.323903 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-confd\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.323931 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.323965 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.324038 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-erlang-cookie-secret\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.324056 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-plugins-conf\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.324079 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-server-conf\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.324141 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-tls\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.324167 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-plugins\") pod \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\" (UID: \"3af438c1-d0b9-4ecb-bb88-a0efd14736a4\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 
03:38:20.325088 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.325917 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.326787 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.328945 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-pod-info" (OuterVolumeSpecName: "pod-info") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.342488 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.342559 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-kube-api-access-9gn76" (OuterVolumeSpecName: "kube-api-access-9gn76") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "kube-api-access-9gn76". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.343390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.344336 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.358573 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data" (OuterVolumeSpecName: "config-data") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.378371 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-server-conf" (OuterVolumeSpecName: "server-conf") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.404484 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_45ff249a-854d-4c30-8216-b7bd9482e08c/ovn-northd/0.log" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.404669 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"45ff249a-854d-4c30-8216-b7bd9482e08c","Type":"ContainerDied","Data":"9c1a2b9f960d6adbf0b4615bd89e14438352374dc11fc786a519c3297ce595e5"} Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.404759 4766 scope.go:117] "RemoveContainer" containerID="bd3d08a30c68e8b92264c99f7fd188c185cb8959a30c11e9fd15099288bfbc41" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.405055 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.420914 4766 generic.go:334] "Generic (PLEG): container finished" podID="18f03bec-d533-450d-b79b-7f19dc436d94" containerID="425a19b5cf582ca012bbb5a8e1a477b968972c4db75ac0ca49551bb4ae484803" exitCode=0 Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.420990 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdb76977-cn2gb" event={"ID":"18f03bec-d533-450d-b79b-7f19dc436d94","Type":"ContainerDied","Data":"425a19b5cf582ca012bbb5a8e1a477b968972c4db75ac0ca49551bb4ae484803"} Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.428713 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.429031 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3af438c1-d0b9-4ecb-bb88-a0efd14736a4","Type":"ContainerDied","Data":"962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18"} Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.428546 4766 generic.go:334] "Generic (PLEG): container finished" podID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerID="962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18" exitCode=0 Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.432820 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3af438c1-d0b9-4ecb-bb88-a0efd14736a4","Type":"ContainerDied","Data":"8b97799afc465daad45141bb9b70082bb279b1eac37bfa687c4d9ed8695a8f71"} Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433534 4766 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 
03:38:20.433547 4766 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433555 4766 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433563 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433571 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433578 4766 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433587 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433595 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gn76\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-kube-api-access-9gn76\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433613 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.433621 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.459235 4766 generic.go:334] "Generic (PLEG): container finished" podID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerID="99fdba038c7e2b9ae61670d79759f6a9042c470a8ddf322cdd4ad3f61ffc2528" exitCode=0 Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.459295 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48862672-08e2-4ac6-86a3-57d84bbc868d","Type":"ContainerDied","Data":"99fdba038c7e2b9ae61670d79759f6a9042c470a8ddf322cdd4ad3f61ffc2528"} Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.461527 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.475757 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.477744 4766 generic.go:334] "Generic (PLEG): container finished" podID="26d1d344-fbf5-415d-952e-9ee50493a134" containerID="0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466" exitCode=0 Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.477790 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"26d1d344-fbf5-415d-952e-9ee50493a134","Type":"ContainerDied","Data":"0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466"} Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.477811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"26d1d344-fbf5-415d-952e-9ee50493a134","Type":"ContainerDied","Data":"19b739acbecbbb03849c91514c71d12e40e3ff07ece2edc6b168f1df0dc03926"} Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.477915 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.487656 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell00dd1-account-delete-flgqj" secret="" err="secret \"galera-openstack-dockercfg-g8hnh\" not found" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.489556 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.498042 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3af438c1-d0b9-4ecb-bb88-a0efd14736a4" (UID: "3af438c1-d0b9-4ecb-bb88-a0efd14736a4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.520804 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.528455 4766 scope.go:117] "RemoveContainer" containerID="24930a35936c9c9541a851424b5da48a1934ae42086233e461a5cadb9e7c9333" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.530535 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.534947 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3af438c1-d0b9-4ecb-bb88-a0efd14736a4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.534969 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.542304 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.554936 4766 scope.go:117] "RemoveContainer" containerID="962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.578003 4766 scope.go:117] "RemoveContainer" containerID="89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.603821 4766 scope.go:117] "RemoveContainer" containerID="962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18" Dec 09 03:38:20 crc kubenswrapper[4766]: E1209 03:38:20.604762 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18\": container with ID starting with 962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18 not found: ID does not exist" containerID="962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.604801 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18"} err="failed to get container status \"962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18\": rpc error: code = NotFound desc = could not find container \"962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18\": container with ID starting with 962568f97a5d1d74446b1792cc32c267f5d0b01f9b07b07fb677e6320810cd18 not found: ID does not exist" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.604831 4766 scope.go:117] "RemoveContainer" containerID="89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238" Dec 09 03:38:20 crc kubenswrapper[4766]: E1209 03:38:20.605194 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238\": container with ID starting with 89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238 not found: ID does not exist" containerID="89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.605275 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238"} err="failed to get container status \"89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238\": rpc error: code = NotFound desc = could not find container \"89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238\": container with ID starting with 89fd925cf8e2e6af226638c75a081b35a25b6dffcc72733ea434db0449aab238 not found: ID does not exist" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.605305 4766 scope.go:117] "RemoveContainer" containerID="0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.622424 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.635038 4766 scope.go:117] "RemoveContainer" containerID="43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.635278 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="354d9984-d7b5-4540-a96e-a68a7bf1b667" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.194:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.639448 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-public-tls-certs\") pod \"18f03bec-d533-450d-b79b-7f19dc436d94\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.639542 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-config-data\") pod \"18f03bec-d533-450d-b79b-7f19dc436d94\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.639570 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zq2b\" (UniqueName: \"kubernetes.io/projected/18f03bec-d533-450d-b79b-7f19dc436d94-kube-api-access-2zq2b\") pod \"18f03bec-d533-450d-b79b-7f19dc436d94\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.639635 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-fernet-keys\") pod 
\"18f03bec-d533-450d-b79b-7f19dc436d94\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.639658 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-internal-tls-certs\") pod \"18f03bec-d533-450d-b79b-7f19dc436d94\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.639732 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-credential-keys\") pod \"18f03bec-d533-450d-b79b-7f19dc436d94\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.639773 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-scripts\") pod \"18f03bec-d533-450d-b79b-7f19dc436d94\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.639803 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-combined-ca-bundle\") pod \"18f03bec-d533-450d-b79b-7f19dc436d94\" (UID: \"18f03bec-d533-450d-b79b-7f19dc436d94\") " Dec 09 03:38:20 crc kubenswrapper[4766]: E1209 03:38:20.640297 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:20 crc kubenswrapper[4766]: E1209 03:38:20.640359 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts podName:8b67744f-1b24-4baf-b397-26cff83c2a4d nodeName:}" failed. 
No retries permitted until 2025-12-09 03:38:24.640341846 +0000 UTC m=+1586.349647272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts") pod "novacell00dd1-account-delete-flgqj" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d") : configmap "openstack-scripts" not found Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.648606 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-scripts" (OuterVolumeSpecName: "scripts") pod "18f03bec-d533-450d-b79b-7f19dc436d94" (UID: "18f03bec-d533-450d-b79b-7f19dc436d94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.664348 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "18f03bec-d533-450d-b79b-7f19dc436d94" (UID: "18f03bec-d533-450d-b79b-7f19dc436d94"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.664458 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f03bec-d533-450d-b79b-7f19dc436d94-kube-api-access-2zq2b" (OuterVolumeSpecName: "kube-api-access-2zq2b") pod "18f03bec-d533-450d-b79b-7f19dc436d94" (UID: "18f03bec-d533-450d-b79b-7f19dc436d94"). InnerVolumeSpecName "kube-api-access-2zq2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.668730 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "18f03bec-d533-450d-b79b-7f19dc436d94" (UID: "18f03bec-d533-450d-b79b-7f19dc436d94"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.673534 4766 scope.go:117] "RemoveContainer" containerID="0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466" Dec 09 03:38:20 crc kubenswrapper[4766]: E1209 03:38:20.673971 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466\": container with ID starting with 0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466 not found: ID does not exist" containerID="0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.674022 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466"} err="failed to get container status \"0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466\": rpc error: code = NotFound desc = could not find container \"0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466\": container with ID starting with 0df65d048c83a7ca498b618bc1f72c8d53f6bb9cf4a7ab255430a4ea10451466 not found: ID does not exist" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.674053 4766 scope.go:117] "RemoveContainer" containerID="43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133" Dec 09 03:38:20 crc kubenswrapper[4766]: E1209 03:38:20.674469 4766 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133\": container with ID starting with 43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133 not found: ID does not exist" containerID="43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.674506 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133"} err="failed to get container status \"43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133\": rpc error: code = NotFound desc = could not find container \"43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133\": container with ID starting with 43950b2abfa0cf79d2aa106ac6854aecb02098ab754c6b653d6a9242214ed133 not found: ID does not exist" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.688604 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18f03bec-d533-450d-b79b-7f19dc436d94" (UID: "18f03bec-d533-450d-b79b-7f19dc436d94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.689306 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-config-data" (OuterVolumeSpecName: "config-data") pod "18f03bec-d533-450d-b79b-7f19dc436d94" (UID: "18f03bec-d533-450d-b79b-7f19dc436d94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.694113 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "18f03bec-d533-450d-b79b-7f19dc436d94" (UID: "18f03bec-d533-450d-b79b-7f19dc436d94"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.694526 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "18f03bec-d533-450d-b79b-7f19dc436d94" (UID: "18f03bec-d533-450d-b79b-7f19dc436d94"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.741869 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.741915 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-tls\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.741949 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9qmk\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-kube-api-access-z9qmk\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: 
\"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.741995 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48862672-08e2-4ac6-86a3-57d84bbc868d-erlang-cookie-secret\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742029 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742095 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-confd\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742119 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-plugins\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48862672-08e2-4ac6-86a3-57d84bbc868d-pod-info\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742279 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-plugins-conf\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742302 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-server-conf\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742342 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-erlang-cookie\") pod \"48862672-08e2-4ac6-86a3-57d84bbc868d\" (UID: \"48862672-08e2-4ac6-86a3-57d84bbc868d\") " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742632 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742654 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742669 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742681 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 
03:38:20.742691 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742703 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742714 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f03bec-d533-450d-b79b-7f19dc436d94-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.742724 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zq2b\" (UniqueName: \"kubernetes.io/projected/18f03bec-d533-450d-b79b-7f19dc436d94-kube-api-access-2zq2b\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.743112 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.743593 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.743955 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.744791 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.747627 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/48862672-08e2-4ac6-86a3-57d84bbc868d-pod-info" (OuterVolumeSpecName: "pod-info") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.749130 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48862672-08e2-4ac6-86a3-57d84bbc868d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.749883 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.750141 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-kube-api-access-z9qmk" (OuterVolumeSpecName: "kube-api-access-z9qmk") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "kube-api-access-z9qmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.766159 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.772809 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.773272 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data" (OuterVolumeSpecName: "config-data") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.783130 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-server-conf" (OuterVolumeSpecName: "server-conf") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.820160 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "48862672-08e2-4ac6-86a3-57d84bbc868d" (UID: "48862672-08e2-4ac6-86a3-57d84bbc868d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843612 4766 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48862672-08e2-4ac6-86a3-57d84bbc868d-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843641 4766 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843651 4766 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843660 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" 
Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843669 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48862672-08e2-4ac6-86a3-57d84bbc868d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843677 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843684 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9qmk\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-kube-api-access-z9qmk\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843692 4766 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48862672-08e2-4ac6-86a3-57d84bbc868d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843716 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843725 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.843733 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48862672-08e2-4ac6-86a3-57d84bbc868d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.848497 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="01ada9c6-91af-4717-a157-29070bf61a6e" path="/var/lib/kubelet/pods/01ada9c6-91af-4717-a157-29070bf61a6e/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.849042 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dfb6314-1f18-4e71-947e-534dc1021381" path="/var/lib/kubelet/pods/1dfb6314-1f18-4e71-947e-534dc1021381/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.849659 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" path="/var/lib/kubelet/pods/26d1d344-fbf5-415d-952e-9ee50493a134/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.851046 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" path="/var/lib/kubelet/pods/3af438c1-d0b9-4ecb-bb88-a0efd14736a4/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.851750 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417726b2-75fd-4efc-84ec-803533df86aa" path="/var/lib/kubelet/pods/417726b2-75fd-4efc-84ec-803533df86aa/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.852861 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" path="/var/lib/kubelet/pods/45ff249a-854d-4c30-8216-b7bd9482e08c/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.853520 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" path="/var/lib/kubelet/pods/47770faa-9973-4d81-a630-8c344bcd7b94/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.854594 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="507e513c-d987-4fba-8fc0-e5ceff892afe" path="/var/lib/kubelet/pods/507e513c-d987-4fba-8fc0-e5ceff892afe/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.855536 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" path="/var/lib/kubelet/pods/54e4f191-0150-4bdb-9afa-2cc5164c6b55/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.856369 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5edf46d6-e570-425b-843d-d67f5adde599" path="/var/lib/kubelet/pods/5edf46d6-e570-425b-843d-d67f5adde599/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.856896 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604402ee-2350-4cec-8a56-b5203c3287e8" path="/var/lib/kubelet/pods/604402ee-2350-4cec-8a56-b5203c3287e8/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.857200 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.857815 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77345185-b7d7-46d4-9e72-251bac080f3a" path="/var/lib/kubelet/pods/77345185-b7d7-46d4-9e72-251bac080f3a/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.858393 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" path="/var/lib/kubelet/pods/9edd6e7b-9841-43be-9478-5e7d06d8bd8d/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.859032 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f2b76e-7443-46d3-a296-76196dcc28b7" path="/var/lib/kubelet/pods/d1f2b76e-7443-46d3-a296-76196dcc28b7/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.860202 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec41cb84-c47e-4199-ac5d-825bbf4f7023" path="/var/lib/kubelet/pods/ec41cb84-c47e-4199-ac5d-825bbf4f7023/volumes" Dec 09 03:38:20 crc kubenswrapper[4766]: I1209 03:38:20.946673 4766 reconciler_common.go:293] "Volume detached for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.049279 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2ljqg"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.067985 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2ljqg"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.079037 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance9417-account-delete-lg65s"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.084853 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9417-account-create-update-59kg8"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.090316 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9417-account-create-update-59kg8"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.096106 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance9417-account-delete-lg65s"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.140932 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vgqln"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.150872 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vgqln"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.169789 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicand6a3-account-delete-cjwdk"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.176229 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicand6a3-account-delete-cjwdk"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.181791 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d6a3-account-create-update-jllzv"] Dec 09 03:38:21 crc kubenswrapper[4766]: 
I1209 03:38:21.187401 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d6a3-account-create-update-jllzv"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.244458 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-cqs82"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.256786 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-cqs82"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.271379 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0abc-account-create-update-dxlcc"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.277757 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement0abc-account-delete-9k2gp"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.285251 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0abc-account-create-update-dxlcc"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.289057 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement0abc-account-delete-9k2gp"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.501868 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.501874 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48862672-08e2-4ac6-86a3-57d84bbc868d","Type":"ContainerDied","Data":"599f1cb825e907d0488c9144a6849ea3cfe74d4ef43106b360d7bce05eaa8d65"} Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.501910 4766 scope.go:117] "RemoveContainer" containerID="99fdba038c7e2b9ae61670d79759f6a9042c470a8ddf322cdd4ad3f61ffc2528" Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.503777 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdb76977-cn2gb" event={"ID":"18f03bec-d533-450d-b79b-7f19dc436d94","Type":"ContainerDied","Data":"d1ef62b942ff38e301503c2836131d0103803379b2c2058684505eb228a90159"} Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.503842 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fdb76977-cn2gb" Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.535412 4766 scope.go:117] "RemoveContainer" containerID="953d3d0d70a4b3d2cd1bec6f32e0bdb69cc885c56c1891b2288ff895a80a5c71" Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.549820 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.569280 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.580427 4766 scope.go:117] "RemoveContainer" containerID="425a19b5cf582ca012bbb5a8e1a477b968972c4db75ac0ca49551bb4ae484803" Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.595419 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qx64j"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.599781 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-db-create-qx64j"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.605426 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5fdb76977-cn2gb"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.611525 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5fdb76977-cn2gb"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.619533 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-0dd1-account-create-update-wk7mx"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.630604 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-0dd1-account-create-update-wk7mx"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.636331 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell00dd1-account-delete-flgqj"] Dec 09 03:38:21 crc kubenswrapper[4766]: I1209 03:38:21.636497 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell00dd1-account-delete-flgqj" podUID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerName="mariadb-account-delete" containerID="cri-o://0676e56c92cfee8a1d0d596375582bc5791b8e314503bd6233f9da366126d99f" gracePeriod=30 Dec 09 03:38:21 crc kubenswrapper[4766]: E1209 03:38:21.670438 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:21 crc kubenswrapper[4766]: E1209 03:38:21.670835 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:21 crc kubenswrapper[4766]: E1209 03:38:21.671415 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:21 crc kubenswrapper[4766]: E1209 03:38:21.671439 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:21 crc kubenswrapper[4766]: E1209 03:38:21.671510 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server" Dec 09 03:38:21 crc kubenswrapper[4766]: E1209 03:38:21.673689 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 
03:38:21 crc kubenswrapper[4766]: E1209 03:38:21.675076 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:21 crc kubenswrapper[4766]: E1209 03:38:21.675123 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.716056 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": dial tcp 10.217.0.155:9311: i/o timeout" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.716056 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b9bc9ddf8-vdj7m" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.850448 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f03bec-d533-450d-b79b-7f19dc436d94" path="/var/lib/kubelet/pods/18f03bec-d533-450d-b79b-7f19dc436d94/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.850998 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d03cec-423f-4fd0-b08e-17dfaa4cc8d6" 
path="/var/lib/kubelet/pods/26d03cec-423f-4fd0-b08e-17dfaa4cc8d6/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.851782 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48862672-08e2-4ac6-86a3-57d84bbc868d" path="/var/lib/kubelet/pods/48862672-08e2-4ac6-86a3-57d84bbc868d/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.853573 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a711ad-556a-4ead-a674-dc1ad1522e94" path="/var/lib/kubelet/pods/67a711ad-556a-4ead-a674-dc1ad1522e94/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.854041 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1cdb70-62c7-4d17-8c11-4f9f40fe8032" path="/var/lib/kubelet/pods/6b1cdb70-62c7-4d17-8c11-4f9f40fe8032/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.854558 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3b76d0-577e-4c52-93b9-23325fc37634" path="/var/lib/kubelet/pods/6e3b76d0-577e-4c52-93b9-23325fc37634/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.855428 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e65ef84-392e-49dd-8f6d-ed4562e8524a" path="/var/lib/kubelet/pods/6e65ef84-392e-49dd-8f6d-ed4562e8524a/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.856017 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c956d0-404a-4be0-a2ba-27706be74ec0" path="/var/lib/kubelet/pods/85c956d0-404a-4be0-a2ba-27706be74ec0/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.857201 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5be126-5890-4cef-aa82-3bdeef1918cd" path="/var/lib/kubelet/pods/8f5be126-5890-4cef-aa82-3bdeef1918cd/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.857759 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a764f29a-d427-43a5-833f-34b2064c122d" 
path="/var/lib/kubelet/pods/a764f29a-d427-43a5-833f-34b2064c122d/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.858607 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc4fff1-4a0a-44ab-af32-edad395bef00" path="/var/lib/kubelet/pods/cdc4fff1-4a0a-44ab-af32-edad395bef00/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.859059 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66f5276-1f41-4add-b998-70930f40b6a7" path="/var/lib/kubelet/pods/d66f5276-1f41-4add-b998-70930f40b6a7/volumes" Dec 09 03:38:22 crc kubenswrapper[4766]: I1209 03:38:22.859562 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f227ef39-ddef-411a-96b3-96871679cae1" path="/var/lib/kubelet/pods/f227ef39-ddef-411a-96b3-96871679cae1/volumes" Dec 09 03:38:24 crc kubenswrapper[4766]: E1209 03:38:24.704195 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:24 crc kubenswrapper[4766]: E1209 03:38:24.704518 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts podName:8b67744f-1b24-4baf-b397-26cff83c2a4d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:32.704503133 +0000 UTC m=+1594.413808559 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts") pod "novacell00dd1-account-delete-flgqj" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d") : configmap "openstack-scripts" not found Dec 09 03:38:25 crc kubenswrapper[4766]: I1209 03:38:25.142887 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-649b876b57-9jd5p" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.148:9696/\": dial tcp 10.217.0.148:9696: connect: connection refused" Dec 09 03:38:26 crc kubenswrapper[4766]: E1209 03:38:26.670095 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:26 crc kubenswrapper[4766]: E1209 03:38:26.672246 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:26 crc kubenswrapper[4766]: E1209 03:38:26.672416 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] 
Dec 09 03:38:26 crc kubenswrapper[4766]: E1209 03:38:26.672852 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:26 crc kubenswrapper[4766]: E1209 03:38:26.672900 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server" Dec 09 03:38:26 crc kubenswrapper[4766]: E1209 03:38:26.674346 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:26 crc kubenswrapper[4766]: E1209 03:38:26.675501 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:26 crc kubenswrapper[4766]: E1209 03:38:26.675676 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" Dec 09 03:38:27 crc kubenswrapper[4766]: I1209 03:38:27.838928 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:38:27 crc kubenswrapper[4766]: E1209 03:38:27.839539 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.611989 4766 generic.go:334] "Generic (PLEG): container finished" podID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerID="7d778b40fe61554f26c1ca0d67e81146904e79a7748a3d0c87bbd49972278c52" exitCode=0 Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.612028 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649b876b57-9jd5p" event={"ID":"be23a05e-591f-4bdf-9c5f-8ee930181397","Type":"ContainerDied","Data":"7d778b40fe61554f26c1ca0d67e81146904e79a7748a3d0c87bbd49972278c52"} Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.699690 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.776961 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tplz2\" (UniqueName: \"kubernetes.io/projected/be23a05e-591f-4bdf-9c5f-8ee930181397-kube-api-access-tplz2\") pod \"be23a05e-591f-4bdf-9c5f-8ee930181397\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.777105 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-internal-tls-certs\") pod \"be23a05e-591f-4bdf-9c5f-8ee930181397\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.777137 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-ovndb-tls-certs\") pod \"be23a05e-591f-4bdf-9c5f-8ee930181397\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.777171 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-httpd-config\") pod \"be23a05e-591f-4bdf-9c5f-8ee930181397\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.777251 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-combined-ca-bundle\") pod \"be23a05e-591f-4bdf-9c5f-8ee930181397\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.777277 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-config\") pod \"be23a05e-591f-4bdf-9c5f-8ee930181397\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.777315 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-public-tls-certs\") pod \"be23a05e-591f-4bdf-9c5f-8ee930181397\" (UID: \"be23a05e-591f-4bdf-9c5f-8ee930181397\") " Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.782338 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "be23a05e-591f-4bdf-9c5f-8ee930181397" (UID: "be23a05e-591f-4bdf-9c5f-8ee930181397"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.797432 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be23a05e-591f-4bdf-9c5f-8ee930181397-kube-api-access-tplz2" (OuterVolumeSpecName: "kube-api-access-tplz2") pod "be23a05e-591f-4bdf-9c5f-8ee930181397" (UID: "be23a05e-591f-4bdf-9c5f-8ee930181397"). InnerVolumeSpecName "kube-api-access-tplz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.814771 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be23a05e-591f-4bdf-9c5f-8ee930181397" (UID: "be23a05e-591f-4bdf-9c5f-8ee930181397"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.821826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "be23a05e-591f-4bdf-9c5f-8ee930181397" (UID: "be23a05e-591f-4bdf-9c5f-8ee930181397"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.822985 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be23a05e-591f-4bdf-9c5f-8ee930181397" (UID: "be23a05e-591f-4bdf-9c5f-8ee930181397"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.826347 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-config" (OuterVolumeSpecName: "config") pod "be23a05e-591f-4bdf-9c5f-8ee930181397" (UID: "be23a05e-591f-4bdf-9c5f-8ee930181397"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.840471 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "be23a05e-591f-4bdf-9c5f-8ee930181397" (UID: "be23a05e-591f-4bdf-9c5f-8ee930181397"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.879494 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.879531 4766 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.879541 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.879549 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.879557 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-config\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.879566 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be23a05e-591f-4bdf-9c5f-8ee930181397-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:28 crc kubenswrapper[4766]: I1209 03:38:28.879575 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tplz2\" (UniqueName: \"kubernetes.io/projected/be23a05e-591f-4bdf-9c5f-8ee930181397-kube-api-access-tplz2\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:29 crc kubenswrapper[4766]: I1209 03:38:29.624968 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649b876b57-9jd5p" event={"ID":"be23a05e-591f-4bdf-9c5f-8ee930181397","Type":"ContainerDied","Data":"1ae9831093f4629cfe9946e8fa58ca65550ce675bd1fdbc21ffcbd735654d2e5"} Dec 09 03:38:29 crc kubenswrapper[4766]: I1209 03:38:29.625026 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-649b876b57-9jd5p" Dec 09 03:38:29 crc kubenswrapper[4766]: I1209 03:38:29.625063 4766 scope.go:117] "RemoveContainer" containerID="17927a86e4710dcdbcd88c8f33269d8edc8c5332c71c20da2f6297f153f510c1" Dec 09 03:38:29 crc kubenswrapper[4766]: I1209 03:38:29.651378 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-649b876b57-9jd5p"] Dec 09 03:38:29 crc kubenswrapper[4766]: I1209 03:38:29.657360 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-649b876b57-9jd5p"] Dec 09 03:38:29 crc kubenswrapper[4766]: I1209 03:38:29.663826 4766 scope.go:117] "RemoveContainer" containerID="7d778b40fe61554f26c1ca0d67e81146904e79a7748a3d0c87bbd49972278c52" Dec 09 03:38:30 crc kubenswrapper[4766]: I1209 03:38:30.856624 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" path="/var/lib/kubelet/pods/be23a05e-591f-4bdf-9c5f-8ee930181397/volumes" Dec 09 03:38:31 crc kubenswrapper[4766]: E1209 03:38:31.670792 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:31 crc kubenswrapper[4766]: E1209 03:38:31.671885 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:31 crc kubenswrapper[4766]: E1209 03:38:31.672093 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:31 crc kubenswrapper[4766]: E1209 03:38:31.672378 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:31 crc kubenswrapper[4766]: E1209 03:38:31.672431 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server" Dec 09 03:38:31 crc kubenswrapper[4766]: E1209 03:38:31.674240 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:31 crc kubenswrapper[4766]: E1209 03:38:31.675959 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:31 crc kubenswrapper[4766]: E1209 03:38:31.675996 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" Dec 09 03:38:32 crc kubenswrapper[4766]: E1209 03:38:32.760361 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:32 crc kubenswrapper[4766]: E1209 03:38:32.760781 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts podName:8b67744f-1b24-4baf-b397-26cff83c2a4d nodeName:}" failed. No retries permitted until 2025-12-09 03:38:48.760759028 +0000 UTC m=+1610.470064454 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts") pod "novacell00dd1-account-delete-flgqj" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d") : configmap "openstack-scripts" not found Dec 09 03:38:36 crc kubenswrapper[4766]: E1209 03:38:36.670362 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:36 crc kubenswrapper[4766]: E1209 03:38:36.671399 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:36 crc kubenswrapper[4766]: E1209 03:38:36.671435 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:36 crc kubenswrapper[4766]: E1209 03:38:36.672428 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" 
containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:36 crc kubenswrapper[4766]: E1209 03:38:36.672487 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server" Dec 09 03:38:36 crc kubenswrapper[4766]: E1209 03:38:36.673050 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:36 crc kubenswrapper[4766]: E1209 03:38:36.675365 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:36 crc kubenswrapper[4766]: E1209 03:38:36.675414 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" Dec 09 03:38:41 crc kubenswrapper[4766]: E1209 03:38:41.671958 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:41 crc kubenswrapper[4766]: E1209 03:38:41.672056 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48 is running failed: container process not found" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:41 crc kubenswrapper[4766]: E1209 03:38:41.673030 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:41 crc kubenswrapper[4766]: E1209 03:38:41.673051 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48 is running failed: container process not found" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:41 crc kubenswrapper[4766]: E1209 03:38:41.673915 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process 
not found" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 09 03:38:41 crc kubenswrapper[4766]: E1209 03:38:41.673999 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server" Dec 09 03:38:41 crc kubenswrapper[4766]: E1209 03:38:41.674091 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48 is running failed: container process not found" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 09 03:38:41 crc kubenswrapper[4766]: E1209 03:38:41.674146 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-h9kbs" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" Dec 09 03:38:41 crc kubenswrapper[4766]: I1209 03:38:41.812511 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h9kbs_149434b0-ace1-4e8f-9be4-76eb650f7c7f/ovs-vswitchd/0.log" Dec 09 03:38:41 crc kubenswrapper[4766]: I1209 03:38:41.813620 4766 generic.go:334] "Generic (PLEG): container finished" podID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" exitCode=137 Dec 
09 03:38:41 crc kubenswrapper[4766]: I1209 03:38:41.813716 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h9kbs" event={"ID":"149434b0-ace1-4e8f-9be4-76eb650f7c7f","Type":"ContainerDied","Data":"1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48"} Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.187658 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h9kbs_149434b0-ace1-4e8f-9be4-76eb650f7c7f/ovs-vswitchd/0.log" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.188875 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230402 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-etc-ovs\") pod \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230511 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "149434b0-ace1-4e8f-9be4-76eb650f7c7f" (UID: "149434b0-ace1-4e8f-9be4-76eb650f7c7f"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230526 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z894b\" (UniqueName: \"kubernetes.io/projected/149434b0-ace1-4e8f-9be4-76eb650f7c7f-kube-api-access-z894b\") pod \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230637 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-log\") pod \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230689 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/149434b0-ace1-4e8f-9be4-76eb650f7c7f-scripts\") pod \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230747 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-lib\") pod \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230768 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-run\") pod \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\" (UID: \"149434b0-ace1-4e8f-9be4-76eb650f7c7f\") " Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230770 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-log" 
(OuterVolumeSpecName: "var-log") pod "149434b0-ace1-4e8f-9be4-76eb650f7c7f" (UID: "149434b0-ace1-4e8f-9be4-76eb650f7c7f"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230849 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-lib" (OuterVolumeSpecName: "var-lib") pod "149434b0-ace1-4e8f-9be4-76eb650f7c7f" (UID: "149434b0-ace1-4e8f-9be4-76eb650f7c7f"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.230912 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-run" (OuterVolumeSpecName: "var-run") pod "149434b0-ace1-4e8f-9be4-76eb650f7c7f" (UID: "149434b0-ace1-4e8f-9be4-76eb650f7c7f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.231311 4766 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-lib\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.231328 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-run\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.231337 4766 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.231347 4766 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/149434b0-ace1-4e8f-9be4-76eb650f7c7f-var-log\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.231950 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149434b0-ace1-4e8f-9be4-76eb650f7c7f-scripts" (OuterVolumeSpecName: "scripts") pod "149434b0-ace1-4e8f-9be4-76eb650f7c7f" (UID: "149434b0-ace1-4e8f-9be4-76eb650f7c7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.242972 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149434b0-ace1-4e8f-9be4-76eb650f7c7f-kube-api-access-z894b" (OuterVolumeSpecName: "kube-api-access-z894b") pod "149434b0-ace1-4e8f-9be4-76eb650f7c7f" (UID: "149434b0-ace1-4e8f-9be4-76eb650f7c7f"). InnerVolumeSpecName "kube-api-access-z894b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.332485 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z894b\" (UniqueName: \"kubernetes.io/projected/149434b0-ace1-4e8f-9be4-76eb650f7c7f-kube-api-access-z894b\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.332531 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/149434b0-ace1-4e8f-9be4-76eb650f7c7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.829030 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-h9kbs_149434b0-ace1-4e8f-9be4-76eb650f7c7f/ovs-vswitchd/0.log" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.832047 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-h9kbs" 
event={"ID":"149434b0-ace1-4e8f-9be4-76eb650f7c7f","Type":"ContainerDied","Data":"a788d5ee525d091bda823d25451d7e2fd0e5db2d8e284abdcb3d065876a6baf2"} Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.832127 4766 scope.go:117] "RemoveContainer" containerID="1759a3f9d91ac7492fec7596d908e1d35e1510395421ce2277052b78680c3b48" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.832204 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-h9kbs" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.840947 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:38:42 crc kubenswrapper[4766]: E1209 03:38:42.841295 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.860365 4766 scope.go:117] "RemoveContainer" containerID="d3980c79fe60136b2d0b79c0961b6b835ca3adedf6bebd93a9fd4ea7a14501a1" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.883465 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-h9kbs"] Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.886663 4766 scope.go:117] "RemoveContainer" containerID="a0903176e01683b3ccdb86281a0e088025dfb50a20e870a2e7c0105d7f97b042" Dec 09 03:38:42 crc kubenswrapper[4766]: I1209 03:38:42.889671 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-h9kbs"] Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.683654 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.731280 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752164 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-cache\") pod \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752208 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") pod \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752327 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3cf59e92-46b9-4693-b9ec-1a669b0e3897-etc-machine-id\") pod \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752384 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-scripts\") pod \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752410 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-lock\") pod \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 
03:38:43.752432 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbl5z\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-kube-api-access-zbl5z\") pod \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752454 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\" (UID: \"a13b4958-6576-4cdb-8237-7e8bedeef9fc\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752480 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data-custom\") pod \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752510 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-combined-ca-bundle\") pod \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752536 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfpl4\" (UniqueName: \"kubernetes.io/projected/3cf59e92-46b9-4693-b9ec-1a669b0e3897-kube-api-access-qfpl4\") pod \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.752574 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data\") pod 
\"3cf59e92-46b9-4693-b9ec-1a669b0e3897\" (UID: \"3cf59e92-46b9-4693-b9ec-1a669b0e3897\") " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.753201 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-lock" (OuterVolumeSpecName: "lock") pod "a13b4958-6576-4cdb-8237-7e8bedeef9fc" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.753673 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-cache" (OuterVolumeSpecName: "cache") pod "a13b4958-6576-4cdb-8237-7e8bedeef9fc" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.754109 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cf59e92-46b9-4693-b9ec-1a669b0e3897-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3cf59e92-46b9-4693-b9ec-1a669b0e3897" (UID: "3cf59e92-46b9-4693-b9ec-1a669b0e3897"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.758753 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-kube-api-access-zbl5z" (OuterVolumeSpecName: "kube-api-access-zbl5z") pod "a13b4958-6576-4cdb-8237-7e8bedeef9fc" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc"). InnerVolumeSpecName "kube-api-access-zbl5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.758791 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a13b4958-6576-4cdb-8237-7e8bedeef9fc" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.759403 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3cf59e92-46b9-4693-b9ec-1a669b0e3897" (UID: "3cf59e92-46b9-4693-b9ec-1a669b0e3897"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.759785 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "a13b4958-6576-4cdb-8237-7e8bedeef9fc" (UID: "a13b4958-6576-4cdb-8237-7e8bedeef9fc"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.760078 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-scripts" (OuterVolumeSpecName: "scripts") pod "3cf59e92-46b9-4693-b9ec-1a669b0e3897" (UID: "3cf59e92-46b9-4693-b9ec-1a669b0e3897"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.761713 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf59e92-46b9-4693-b9ec-1a669b0e3897-kube-api-access-qfpl4" (OuterVolumeSpecName: "kube-api-access-qfpl4") pod "3cf59e92-46b9-4693-b9ec-1a669b0e3897" (UID: "3cf59e92-46b9-4693-b9ec-1a669b0e3897"). InnerVolumeSpecName "kube-api-access-qfpl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.794366 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cf59e92-46b9-4693-b9ec-1a669b0e3897" (UID: "3cf59e92-46b9-4693-b9ec-1a669b0e3897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.821585 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data" (OuterVolumeSpecName: "config-data") pod "3cf59e92-46b9-4693-b9ec-1a669b0e3897" (UID: "3cf59e92-46b9-4693-b9ec-1a669b0e3897"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.845965 4766 generic.go:334] "Generic (PLEG): container finished" podID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerID="d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493" exitCode=137 Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.846112 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.846333 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3cf59e92-46b9-4693-b9ec-1a669b0e3897","Type":"ContainerDied","Data":"d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493"} Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.846408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3cf59e92-46b9-4693-b9ec-1a669b0e3897","Type":"ContainerDied","Data":"304ea017233a8d3ddf1b180def120dd60b1676c51d56e1a8e56b56306b7af19d"} Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.846437 4766 scope.go:117] "RemoveContainer" containerID="fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853560 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3cf59e92-46b9-4693-b9ec-1a669b0e3897-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853586 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853596 4766 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-lock\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853605 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbl5z\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-kube-api-access-zbl5z\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853627 4766 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853638 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853647 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853656 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfpl4\" (UniqueName: \"kubernetes.io/projected/3cf59e92-46b9-4693-b9ec-1a669b0e3897-kube-api-access-qfpl4\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853666 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf59e92-46b9-4693-b9ec-1a669b0e3897-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853675 4766 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a13b4958-6576-4cdb-8237-7e8bedeef9fc-cache\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.853683 4766 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a13b4958-6576-4cdb-8237-7e8bedeef9fc-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.858810 4766 generic.go:334] "Generic (PLEG): container finished" podID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerID="3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2" 
exitCode=137 Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.858856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2"} Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.858884 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a13b4958-6576-4cdb-8237-7e8bedeef9fc","Type":"ContainerDied","Data":"688fcfc4446b57ee29f3e81848e1d6362676296f66726b747133558fbc65c0c7"} Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.859089 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.871388 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.888370 4766 scope.go:117] "RemoveContainer" containerID="d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.889050 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.923107 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.926074 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.935570 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.954967 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.958768 4766 scope.go:117] "RemoveContainer" containerID="fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c" Dec 09 03:38:43 crc kubenswrapper[4766]: E1209 03:38:43.959309 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c\": container with ID starting with fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c not found: ID does not exist" containerID="fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.959390 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c"} err="failed to get container status \"fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c\": rpc error: code = NotFound desc = could not find container \"fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c\": container with ID starting with fa4ceea0f1192d4e73621fdc968589e3da4009817887282157a4bbe97fc1463c not found: ID does not exist" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.959423 4766 scope.go:117] "RemoveContainer" containerID="d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493" Dec 09 03:38:43 crc kubenswrapper[4766]: E1209 03:38:43.959847 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493\": container with ID starting with d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493 not found: ID does not exist" containerID="d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493" Dec 09 03:38:43 crc 
kubenswrapper[4766]: I1209 03:38:43.959938 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493"} err="failed to get container status \"d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493\": rpc error: code = NotFound desc = could not find container \"d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493\": container with ID starting with d68137f6a3be895b2a5471b6805eb358385cf1eb54dbc83095d240feaafb6493 not found: ID does not exist" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.960010 4766 scope.go:117] "RemoveContainer" containerID="3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.980923 4766 scope.go:117] "RemoveContainer" containerID="3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f" Dec 09 03:38:43 crc kubenswrapper[4766]: I1209 03:38:43.997914 4766 scope.go:117] "RemoveContainer" containerID="e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.015353 4766 scope.go:117] "RemoveContainer" containerID="071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.035810 4766 scope.go:117] "RemoveContainer" containerID="0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.056550 4766 scope.go:117] "RemoveContainer" containerID="c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.074243 4766 scope.go:117] "RemoveContainer" containerID="bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.090340 4766 scope.go:117] "RemoveContainer" 
containerID="6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.120550 4766 scope.go:117] "RemoveContainer" containerID="1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.145060 4766 scope.go:117] "RemoveContainer" containerID="d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.176146 4766 scope.go:117] "RemoveContainer" containerID="c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.193653 4766 scope.go:117] "RemoveContainer" containerID="e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.216849 4766 scope.go:117] "RemoveContainer" containerID="6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.240384 4766 scope.go:117] "RemoveContainer" containerID="b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.265290 4766 scope.go:117] "RemoveContainer" containerID="ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.286793 4766 scope.go:117] "RemoveContainer" containerID="3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.287266 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2\": container with ID starting with 3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2 not found: ID does not exist" containerID="3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2" Dec 09 03:38:44 crc 
kubenswrapper[4766]: I1209 03:38:44.287305 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2"} err="failed to get container status \"3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2\": rpc error: code = NotFound desc = could not find container \"3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2\": container with ID starting with 3b5044415c38167b32f64c9a804ffae903c228ea4690de8e3096a4c327dc82d2 not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.287393 4766 scope.go:117] "RemoveContainer" containerID="3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.287684 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f\": container with ID starting with 3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f not found: ID does not exist" containerID="3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.287711 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f"} err="failed to get container status \"3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f\": rpc error: code = NotFound desc = could not find container \"3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f\": container with ID starting with 3fda3099e8dadcdf805ffc0e7fabc8e5e1e120126160c79737bff63f1ebcfe5f not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.287728 4766 scope.go:117] "RemoveContainer" containerID="e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74" Dec 09 
03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.288658 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74\": container with ID starting with e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74 not found: ID does not exist" containerID="e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.288709 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74"} err="failed to get container status \"e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74\": rpc error: code = NotFound desc = could not find container \"e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74\": container with ID starting with e6863895544863e7e87a063bc896fc8ab76e478314973caaf70b489d5795af74 not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.288729 4766 scope.go:117] "RemoveContainer" containerID="071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.288993 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c\": container with ID starting with 071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c not found: ID does not exist" containerID="071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.289015 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c"} err="failed to get container status 
\"071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c\": rpc error: code = NotFound desc = could not find container \"071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c\": container with ID starting with 071c54e47e37259a3d1e3e6e99a6d41b2d62428e7c60fd0e8b212de00d94e29c not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.289029 4766 scope.go:117] "RemoveContainer" containerID="0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.289318 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b\": container with ID starting with 0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b not found: ID does not exist" containerID="0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.289336 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b"} err="failed to get container status \"0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b\": rpc error: code = NotFound desc = could not find container \"0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b\": container with ID starting with 0330a7cdf721bd279e9c9f30edb3d54033ee2922cbb3092d4653dfc014008f2b not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.289348 4766 scope.go:117] "RemoveContainer" containerID="c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.289805 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527\": container with ID starting with c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527 not found: ID does not exist" containerID="c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.289840 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527"} err="failed to get container status \"c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527\": rpc error: code = NotFound desc = could not find container \"c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527\": container with ID starting with c868594e9227a26643b5e66ed6b94a20bf6d040d6b2c8fd845aa08d1afbe5527 not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.289863 4766 scope.go:117] "RemoveContainer" containerID="bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.290145 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea\": container with ID starting with bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea not found: ID does not exist" containerID="bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.290175 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea"} err="failed to get container status \"bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea\": rpc error: code = NotFound desc = could not find container \"bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea\": container with ID 
starting with bed3dd974efaa9c2fb0323082c9409dc659a9d7c26d80318f7a4ddfa0e376eea not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.290191 4766 scope.go:117] "RemoveContainer" containerID="6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.290401 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73\": container with ID starting with 6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73 not found: ID does not exist" containerID="6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.290421 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73"} err="failed to get container status \"6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73\": rpc error: code = NotFound desc = could not find container \"6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73\": container with ID starting with 6ab8af247580971acc96928cb1cf95b3f710fc6304d7efb44e2c5a66b2257b73 not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.290433 4766 scope.go:117] "RemoveContainer" containerID="1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.290740 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac\": container with ID starting with 1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac not found: ID does not exist" containerID="1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac" Dec 09 
03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.290771 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac"} err="failed to get container status \"1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac\": rpc error: code = NotFound desc = could not find container \"1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac\": container with ID starting with 1988b87fbdd021d14c9a0226d7f3de2584bdf48151ea23511673d17fd918c9ac not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.290785 4766 scope.go:117] "RemoveContainer" containerID="d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.291027 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba\": container with ID starting with d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba not found: ID does not exist" containerID="d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.291051 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba"} err="failed to get container status \"d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba\": rpc error: code = NotFound desc = could not find container \"d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba\": container with ID starting with d70c0e094614325642fc23c93dd4598de38a8101bc97c53bf080f668d426d1ba not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.291066 4766 scope.go:117] "RemoveContainer" 
containerID="c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.291463 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7\": container with ID starting with c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7 not found: ID does not exist" containerID="c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.291483 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7"} err="failed to get container status \"c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7\": rpc error: code = NotFound desc = could not find container \"c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7\": container with ID starting with c1aa7aa885ca387e7cf824ddf748771ce320a8ee4af115d5ad51825495ccf9a7 not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.291495 4766 scope.go:117] "RemoveContainer" containerID="e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.291801 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225\": container with ID starting with e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225 not found: ID does not exist" containerID="e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.291822 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225"} err="failed to get container status \"e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225\": rpc error: code = NotFound desc = could not find container \"e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225\": container with ID starting with e11133ba7535b91ee19ae97e127fe82a9044eddd9300d45c18c1bfebc02f4225 not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.291834 4766 scope.go:117] "RemoveContainer" containerID="6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.292021 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f\": container with ID starting with 6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f not found: ID does not exist" containerID="6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.292038 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f"} err="failed to get container status \"6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f\": rpc error: code = NotFound desc = could not find container \"6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f\": container with ID starting with 6aa5b2026991d728ff3a992955b61fbdaf3709c13e79b22690e6161cb0a3e66f not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.292049 4766 scope.go:117] "RemoveContainer" containerID="b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.292415 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7\": container with ID starting with b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7 not found: ID does not exist" containerID="b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.292434 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7"} err="failed to get container status \"b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7\": rpc error: code = NotFound desc = could not find container \"b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7\": container with ID starting with b9f86af76e8de07ee6b1b52427cac451ad850115feed2f75226081c8bf54b3d7 not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.292445 4766 scope.go:117] "RemoveContainer" containerID="ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14" Dec 09 03:38:44 crc kubenswrapper[4766]: E1209 03:38:44.292681 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14\": container with ID starting with ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14 not found: ID does not exist" containerID="ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.292702 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14"} err="failed to get container status \"ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14\": rpc error: code = NotFound desc = could not find container 
\"ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14\": container with ID starting with ac575d3c70a5819055526f5ef9dbeabd8ff3cf17ece2524082b9cd7bdf3b7f14 not found: ID does not exist" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.849867 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" path="/var/lib/kubelet/pods/149434b0-ace1-4e8f-9be4-76eb650f7c7f/volumes" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.850913 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" path="/var/lib/kubelet/pods/3cf59e92-46b9-4693-b9ec-1a669b0e3897/volumes" Dec 09 03:38:44 crc kubenswrapper[4766]: I1209 03:38:44.851778 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" path="/var/lib/kubelet/pods/a13b4958-6576-4cdb-8237-7e8bedeef9fc/volumes" Dec 09 03:38:47 crc kubenswrapper[4766]: I1209 03:38:47.851990 4766 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poda57927d7-7099-4b87-99ee-77aa589cd09f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poda57927d7-7099-4b87-99ee-77aa589cd09f] : Timed out while waiting for systemd to remove kubepods-besteffort-poda57927d7_7099_4b87_99ee_77aa589cd09f.slice" Dec 09 03:38:47 crc kubenswrapper[4766]: I1209 03:38:47.853195 4766 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7ae0a18c-f118-45b5-8989-9ca3a49827ad"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7ae0a18c-f118-45b5-8989-9ca3a49827ad] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7ae0a18c_f118_45b5_8989_9ca3a49827ad.slice" Dec 09 03:38:47 crc kubenswrapper[4766]: I1209 03:38:47.858023 4766 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","pod04522868-a66d-44f8-a9bb-6f157f26653f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod04522868-a66d-44f8-a9bb-6f157f26653f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod04522868_a66d_44f8_a9bb_6f157f26653f.slice" Dec 09 03:38:47 crc kubenswrapper[4766]: E1209 03:38:47.858079 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod04522868-a66d-44f8-a9bb-6f157f26653f] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod04522868-a66d-44f8-a9bb-6f157f26653f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod04522868_a66d_44f8_a9bb_6f157f26653f.slice" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" Dec 09 03:38:47 crc kubenswrapper[4766]: I1209 03:38:47.904745 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b4dbbdb47-q4g54" Dec 09 03:38:47 crc kubenswrapper[4766]: I1209 03:38:47.934152 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7b4dbbdb47-q4g54"] Dec 09 03:38:47 crc kubenswrapper[4766]: I1209 03:38:47.945198 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7b4dbbdb47-q4g54"] Dec 09 03:38:48 crc kubenswrapper[4766]: E1209 03:38:48.826814 4766 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 09 03:38:48 crc kubenswrapper[4766]: E1209 03:38:48.826897 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts podName:8b67744f-1b24-4baf-b397-26cff83c2a4d nodeName:}" failed. No retries permitted until 2025-12-09 03:39:20.826877735 +0000 UTC m=+1642.536183171 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts") pod "novacell00dd1-account-delete-flgqj" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d") : configmap "openstack-scripts" not found Dec 09 03:38:48 crc kubenswrapper[4766]: I1209 03:38:48.860532 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" path="/var/lib/kubelet/pods/04522868-a66d-44f8-a9bb-6f157f26653f/volumes" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.901529 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bv8lf"] Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.901874 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ada9c6-91af-4717-a157-29070bf61a6e" containerName="memcached" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.901888 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ada9c6-91af-4717-a157-29070bf61a6e" containerName="memcached" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.901900 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354d9984-d7b5-4540-a96e-a68a7bf1b667" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.901909 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="354d9984-d7b5-4540-a96e-a68a7bf1b667" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.901926 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417726b2-75fd-4efc-84ec-803533df86aa" containerName="placement-log" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.901934 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="417726b2-75fd-4efc-84ec-803533df86aa" containerName="placement-log" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.901946 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerName="init" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.901952 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerName="init" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.901969 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-api" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.901976 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-api" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.901984 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec41cb84-c47e-4199-ac5d-825bbf4f7023" containerName="nova-cell0-conductor-conductor" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.901991 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec41cb84-c47e-4199-ac5d-825bbf4f7023" containerName="nova-cell0-conductor-conductor" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902001 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="ovsdbserver-sb" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902008 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="ovsdbserver-sb" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902019 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" containerName="cinder-api" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902027 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" containerName="cinder-api" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902037 4766 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-api" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902043 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-api" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902050 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-log" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902057 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-log" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902065 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerName="dnsmasq-dns" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902074 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerName="dnsmasq-dns" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902084 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-log" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902091 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-log" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902101 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-httpd" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902108 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-httpd" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902118 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902125 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902139 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="proxy-httpd" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902146 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="proxy-httpd" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902155 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server-init" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902162 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server-init" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902177 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-updater" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902185 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-updater" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902195 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f03bec-d533-450d-b79b-7f19dc436d94" containerName="keystone-api" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902202 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f03bec-d533-450d-b79b-7f19dc436d94" containerName="keystone-api" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902240 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="rsync" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902248 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="rsync" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902261 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-replicator" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902267 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-replicator" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902276 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-server" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902283 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-server" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902295 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28c984f-04eb-4398-af98-9e2c5e6afd13" containerName="ovn-controller" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902302 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28c984f-04eb-4398-af98-9e2c5e6afd13" containerName="ovn-controller" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902311 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" containerName="mysql-bootstrap" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902318 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" containerName="mysql-bootstrap" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902328 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" containerName="galera" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902336 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" containerName="galera" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902350 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="swift-recon-cron" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902357 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="swift-recon-cron" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902369 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f227ef39-ddef-411a-96b3-96871679cae1" containerName="mariadb-account-delete" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902376 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f227ef39-ddef-411a-96b3-96871679cae1" containerName="mariadb-account-delete" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902391 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerName="setup-container" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902398 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerName="setup-container" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902410 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-updater" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902418 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-updater" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902429 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerName="rabbitmq" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902436 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerName="rabbitmq" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902450 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f2b76e-7443-46d3-a296-76196dcc28b7" containerName="kube-state-metrics" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902457 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f2b76e-7443-46d3-a296-76196dcc28b7" containerName="kube-state-metrics" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902465 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5be126-5890-4cef-aa82-3bdeef1918cd" containerName="mariadb-account-delete" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902472 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5be126-5890-4cef-aa82-3bdeef1918cd" containerName="mariadb-account-delete" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902481 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="ovn-northd" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902488 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="ovn-northd" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902495 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerName="barbican-keystone-listener-log" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902502 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerName="barbican-keystone-listener-log" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902512 4766 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" containerName="mysql-bootstrap" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902519 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" containerName="mysql-bootstrap" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902529 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902536 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902544 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfb6314-1f18-4e71-947e-534dc1021381" containerName="nova-cell1-conductor-conductor" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902552 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfb6314-1f18-4e71-947e-534dc1021381" containerName="nova-cell1-conductor-conductor" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902562 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417726b2-75fd-4efc-84ec-803533df86aa" containerName="placement-api" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902570 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="417726b2-75fd-4efc-84ec-803533df86aa" containerName="placement-api" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902579 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerName="cinder-scheduler" Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902586 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerName="cinder-scheduler" Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902598 4766 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-expirer"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902605 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-expirer"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902618 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="507e513c-d987-4fba-8fc0-e5ceff892afe" containerName="nova-scheduler-scheduler"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902625 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="507e513c-d987-4fba-8fc0-e5ceff892afe" containerName="nova-scheduler-scheduler"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902637 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="sg-core"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902644 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="sg-core"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902656 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902665 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902679 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902686 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902695 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77345185-b7d7-46d4-9e72-251bac080f3a" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902702 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="77345185-b7d7-46d4-9e72-251bac080f3a" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902711 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902718 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902731 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" containerName="cinder-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902738 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" containerName="cinder-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902747 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="ceilometer-central-agent"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902754 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="ceilometer-central-agent"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902762 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="ceilometer-notification-agent"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902769 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="ceilometer-notification-agent"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902780 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-metadata"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902787 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-metadata"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902795 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902801 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902808 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" containerName="barbican-worker-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902815 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" containerName="barbican-worker-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902826 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerName="proxy-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902832 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerName="proxy-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902842 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902848 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902860 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902868 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902882 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902889 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902902 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerName="ovsdbserver-nb"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902908 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerName="ovsdbserver-nb"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902921 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" containerName="barbican-worker"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902928 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" containerName="barbican-worker"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902941 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerName="barbican-keystone-listener"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902948 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerName="barbican-keystone-listener"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902958 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1554173f-b66c-43d5-a5e4-cd10a81f09d4" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902965 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1554173f-b66c-43d5-a5e4-cd10a81f09d4" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902974 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerName="probe"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902982 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerName="probe"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.902990 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.902997 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903007 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-reaper"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903014 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-reaper"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903025 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903032 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903041 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerName="proxy-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903047 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerName="proxy-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903062 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerName="setup-container"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903069 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerName="setup-container"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903082 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903090 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903102 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903109 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903122 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a764f29a-d427-43a5-833f-34b2064c122d" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903129 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a764f29a-d427-43a5-833f-34b2064c122d" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903140 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903147 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903159 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerName="rabbitmq"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903166 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerName="rabbitmq"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903178 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903186 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903194 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" containerName="galera"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903201 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" containerName="galera"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903258 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-replicator"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903266 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-replicator"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903278 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903285 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903292 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-replicator"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903299 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-replicator"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903306 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d1796c-9c3d-444c-bda3-2a7525ac2650" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903312 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d1796c-9c3d-444c-bda3-2a7525ac2650" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: E1209 03:38:49.903324 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604402ee-2350-4cec-8a56-b5203c3287e8" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903332 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="604402ee-2350-4cec-8a56-b5203c3287e8" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903484 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="77345185-b7d7-46d4-9e72-251bac080f3a" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903493 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="sg-core"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903502 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f03bec-d533-450d-b79b-7f19dc436d94" containerName="keystone-api"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903511 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerName="barbican-keystone-listener"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903516 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="ceilometer-notification-agent"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903526 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="507e513c-d987-4fba-8fc0-e5ceff892afe" containerName="nova-scheduler-scheduler"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903536 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7443d5d9-873e-430e-bcad-a90f5d4ca9c6" containerName="dnsmasq-dns"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903543 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerName="proxy-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903552 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d1d344-fbf5-415d-952e-9ee50493a134" containerName="galera"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903562 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-metadata"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903573 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="354d9984-d7b5-4540-a96e-a68a7bf1b667" containerName="nova-cell1-novncproxy-novncproxy"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903585 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903592 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-expirer"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903607 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="swift-recon-cron"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903620 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-reaper"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903629 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903640 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903649 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903657 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerName="ovsdbserver-nb"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903666 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903672 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb7e0f1-e5a8-45ef-9cbf-e308e82968af" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903679 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5edf46d6-e570-425b-843d-d67f5adde599" containerName="barbican-api"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903684 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovs-vswitchd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903692 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ada9c6-91af-4717-a157-29070bf61a6e" containerName="memcached"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903702 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903709 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903717 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-replicator"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903722 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerName="probe"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903731 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" containerName="cinder-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903741 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903749 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="48862672-08e2-4ac6-86a3-57d84bbc868d" containerName="rabbitmq"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903758 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="ceilometer-central-agent"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903768 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-updater"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903775 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="417726b2-75fd-4efc-84ec-803533df86aa" containerName="placement-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903782 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf59e92-46b9-4693-b9ec-1a669b0e3897" containerName="cinder-scheduler"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903787 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="604402ee-2350-4cec-8a56-b5203c3287e8" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903797 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f227ef39-ddef-411a-96b3-96871679cae1" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903803 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903808 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-replicator"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903818 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5be126-5890-4cef-aa82-3bdeef1918cd" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903828 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57927d7-7099-4b87-99ee-77aa589cd09f" containerName="galera"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903836 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dfb6314-1f18-4e71-947e-534dc1021381" containerName="nova-cell1-conductor-conductor"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903848 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" containerName="barbican-worker"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903859 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="149434b0-ace1-4e8f-9be4-76eb650f7c7f" containerName="ovsdb-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903866 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a764f29a-d427-43a5-833f-34b2064c122d" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903874 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf108478-7651-4f37-b0e7-3a571774d030" containerName="ovsdbserver-sb"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903882 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af438c1-d0b9-4ecb-bb88-a0efd14736a4" containerName="rabbitmq"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903892 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="ovn-northd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903902 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903915 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece85ffc-6754-4f25-a66c-cf66043196b3" containerName="nova-metadata-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903923 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="be23a05e-591f-4bdf-9c5f-8ee930181397" containerName="neutron-api"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903935 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99d3047-bb16-4bbe-a77d-0f4199121e7d" containerName="nova-api-api"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903945 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="417726b2-75fd-4efc-84ec-803533df86aa" containerName="placement-api"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903954 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f2b76e-7443-46d3-a296-76196dcc28b7" containerName="kube-state-metrics"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903961 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1554173f-b66c-43d5-a5e4-cd10a81f09d4" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903971 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-replicator"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903978 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="object-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903987 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae0a18c-f118-45b5-8989-9ca3a49827ad" containerName="barbican-keystone-listener-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.903998 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="account-auditor"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904009 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff493c4-bb15-4a40-9499-ca23bf79f42b" containerName="proxy-server"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904018 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d1796c-9c3d-444c-bda3-2a7525ac2650" containerName="mariadb-account-delete"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904025 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ff249a-854d-4c30-8216-b7bd9482e08c" containerName="openstack-network-exporter"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904060 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="04522868-a66d-44f8-a9bb-6f157f26653f" containerName="barbican-worker-log"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904069 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9edd6e7b-9841-43be-9478-5e7d06d8bd8d" containerName="glance-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904078 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e4f191-0150-4bdb-9afa-2cc5164c6b55" containerName="proxy-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904087 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec41cb84-c47e-4199-ac5d-825bbf4f7023" containerName="nova-cell0-conductor-conductor"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904095 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a00c8b-af47-4254-83de-a93a975b3afe" containerName="glance-httpd"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904104 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28c984f-04eb-4398-af98-9e2c5e6afd13" containerName="ovn-controller"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904112 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="rsync"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904121 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4958-6576-4cdb-8237-7e8bedeef9fc" containerName="container-updater"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.904128 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="47770faa-9973-4d81-a630-8c344bcd7b94" containerName="cinder-api"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.905177 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.913327 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bv8lf"]
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.942308 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-utilities\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.942570 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfr88\" (UniqueName: \"kubernetes.io/projected/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-kube-api-access-jfr88\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:49 crc kubenswrapper[4766]: I1209 03:38:49.942737 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-catalog-content\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.044800 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-utilities\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.044939 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfr88\" (UniqueName: \"kubernetes.io/projected/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-kube-api-access-jfr88\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.045380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-catalog-content\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.045739 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-utilities\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.045768 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-catalog-content\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.067103 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfr88\" (UniqueName: \"kubernetes.io/projected/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-kube-api-access-jfr88\") pod \"community-operators-bv8lf\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.229329 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bv8lf"
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.694579 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bv8lf"]
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.944094 4766 generic.go:334] "Generic (PLEG): container finished" podID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerID="fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a" exitCode=0
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.944247 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8lf" event={"ID":"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879","Type":"ContainerDied","Data":"fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a"}
Dec 09 03:38:50 crc kubenswrapper[4766]: I1209 03:38:50.945391 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8lf" event={"ID":"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879","Type":"ContainerStarted","Data":"a894e1dffd724e13270f3cf8cd0f24f479aa98d5544e6ca8ae26998d4d3d32e2"}
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.954760 4766 generic.go:334] "Generic (PLEG): container finished" podID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerID="cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17" exitCode=0
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.954837 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8lf" event={"ID":"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879","Type":"ContainerDied","Data":"cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17"}
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.964691 4766 generic.go:334] "Generic (PLEG): container finished" podID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerID="0676e56c92cfee8a1d0d596375582bc5791b8e314503bd6233f9da366126d99f" exitCode=137
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.964733 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00dd1-account-delete-flgqj" event={"ID":"8b67744f-1b24-4baf-b397-26cff83c2a4d","Type":"ContainerDied","Data":"0676e56c92cfee8a1d0d596375582bc5791b8e314503bd6233f9da366126d99f"}
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.964763 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00dd1-account-delete-flgqj" event={"ID":"8b67744f-1b24-4baf-b397-26cff83c2a4d","Type":"ContainerDied","Data":"9dac53f58fd3a7c0ee08d6358faf161329b52a6ebc57d4fa08e84a3534b2440c"}
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.964776 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dac53f58fd3a7c0ee08d6358faf161329b52a6ebc57d4fa08e84a3534b2440c"
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.964794 4766 scope.go:117] "RemoveContainer" containerID="b964f8ee7b5a62d95ec8c21e2f2cb9480262bdc47471ecb02c71bcf732440066"
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.965574 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell00dd1-account-delete-flgqj"
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.973977 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts\") pod \"8b67744f-1b24-4baf-b397-26cff83c2a4d\" (UID: \"8b67744f-1b24-4baf-b397-26cff83c2a4d\") "
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.974059 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v95q\" (UniqueName: \"kubernetes.io/projected/8b67744f-1b24-4baf-b397-26cff83c2a4d-kube-api-access-2v95q\") pod \"8b67744f-1b24-4baf-b397-26cff83c2a4d\" (UID: \"8b67744f-1b24-4baf-b397-26cff83c2a4d\") "
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.979368 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b67744f-1b24-4baf-b397-26cff83c2a4d" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 03:38:51 crc kubenswrapper[4766]: I1209 03:38:51.981987 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b67744f-1b24-4baf-b397-26cff83c2a4d-kube-api-access-2v95q" (OuterVolumeSpecName: "kube-api-access-2v95q") pod "8b67744f-1b24-4baf-b397-26cff83c2a4d" (UID: "8b67744f-1b24-4baf-b397-26cff83c2a4d"). InnerVolumeSpecName "kube-api-access-2v95q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 03:38:52 crc kubenswrapper[4766]: I1209 03:38:52.076059 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b67744f-1b24-4baf-b397-26cff83c2a4d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 03:38:52 crc kubenswrapper[4766]: I1209 03:38:52.076417 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v95q\" (UniqueName: \"kubernetes.io/projected/8b67744f-1b24-4baf-b397-26cff83c2a4d-kube-api-access-2v95q\") on node \"crc\" DevicePath \"\""
Dec 09 03:38:52 crc kubenswrapper[4766]: I1209 03:38:52.975664 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell00dd1-account-delete-flgqj"
Dec 09 03:38:52 crc kubenswrapper[4766]: I1209 03:38:52.978724 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8lf" event={"ID":"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879","Type":"ContainerStarted","Data":"e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab"}
Dec 09 03:38:53 crc kubenswrapper[4766]: I1209 03:38:53.029456 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell00dd1-account-delete-flgqj"]
Dec 09 03:38:53 crc kubenswrapper[4766]: I1209 03:38:53.038742 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell00dd1-account-delete-flgqj"]
Dec 09 03:38:53 crc kubenswrapper[4766]: I1209 03:38:53.041433 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bv8lf" podStartSLOduration=2.538993862 podStartE2EDuration="4.041361626s" podCreationTimestamp="2025-12-09 03:38:49 +0000 UTC" firstStartedPulling="2025-12-09 03:38:50.945452467 +0000 UTC m=+1612.654757893" lastFinishedPulling="2025-12-09 03:38:52.447820191 +0000 UTC m=+1614.157125657" observedRunningTime="2025-12-09 03:38:53.037203903 +0000 
UTC m=+1614.746509339" watchObservedRunningTime="2025-12-09 03:38:53.041361626 +0000 UTC m=+1614.750667062" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.475388 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q9j55"] Dec 09 03:38:54 crc kubenswrapper[4766]: E1209 03:38:54.476145 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerName="mariadb-account-delete" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.476168 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerName="mariadb-account-delete" Dec 09 03:38:54 crc kubenswrapper[4766]: E1209 03:38:54.476196 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerName="mariadb-account-delete" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.476206 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerName="mariadb-account-delete" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.477167 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerName="mariadb-account-delete" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.477215 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b67744f-1b24-4baf-b397-26cff83c2a4d" containerName="mariadb-account-delete" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.478803 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.482558 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9j55"] Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.508365 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-utilities\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.508500 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-catalog-content\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.508541 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjx6w\" (UniqueName: \"kubernetes.io/projected/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-kube-api-access-sjx6w\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.609858 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-catalog-content\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.610100 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sjx6w\" (UniqueName: \"kubernetes.io/projected/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-kube-api-access-sjx6w\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.610557 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-utilities\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.610659 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-catalog-content\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.610981 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-utilities\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.642863 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjx6w\" (UniqueName: \"kubernetes.io/projected/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-kube-api-access-sjx6w\") pod \"certified-operators-q9j55\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.796774 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:38:54 crc kubenswrapper[4766]: I1209 03:38:54.855231 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b67744f-1b24-4baf-b397-26cff83c2a4d" path="/var/lib/kubelet/pods/8b67744f-1b24-4baf-b397-26cff83c2a4d/volumes" Dec 09 03:38:55 crc kubenswrapper[4766]: I1209 03:38:55.132441 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9j55"] Dec 09 03:38:56 crc kubenswrapper[4766]: I1209 03:38:56.008601 4766 generic.go:334] "Generic (PLEG): container finished" podID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerID="db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff" exitCode=0 Dec 09 03:38:56 crc kubenswrapper[4766]: I1209 03:38:56.008654 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9j55" event={"ID":"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f","Type":"ContainerDied","Data":"db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff"} Dec 09 03:38:56 crc kubenswrapper[4766]: I1209 03:38:56.008937 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9j55" event={"ID":"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f","Type":"ContainerStarted","Data":"e7290de63f60aa1d14d0b482ca5ca9dec9a88e056f46f2edc167e02b29aac849"} Dec 09 03:38:57 crc kubenswrapper[4766]: I1209 03:38:57.020733 4766 generic.go:334] "Generic (PLEG): container finished" podID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerID="bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b" exitCode=0 Dec 09 03:38:57 crc kubenswrapper[4766]: I1209 03:38:57.020783 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9j55" event={"ID":"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f","Type":"ContainerDied","Data":"bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b"} Dec 09 03:38:57 crc 
kubenswrapper[4766]: I1209 03:38:57.839022 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:38:57 crc kubenswrapper[4766]: E1209 03:38:57.839303 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:38:58 crc kubenswrapper[4766]: I1209 03:38:58.030457 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9j55" event={"ID":"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f","Type":"ContainerStarted","Data":"821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07"} Dec 09 03:38:58 crc kubenswrapper[4766]: I1209 03:38:58.049607 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q9j55" podStartSLOduration=2.341136256 podStartE2EDuration="4.049592334s" podCreationTimestamp="2025-12-09 03:38:54 +0000 UTC" firstStartedPulling="2025-12-09 03:38:56.010858656 +0000 UTC m=+1617.720164092" lastFinishedPulling="2025-12-09 03:38:57.719314704 +0000 UTC m=+1619.428620170" observedRunningTime="2025-12-09 03:38:58.048576796 +0000 UTC m=+1619.757882242" watchObservedRunningTime="2025-12-09 03:38:58.049592334 +0000 UTC m=+1619.758897750" Dec 09 03:39:00 crc kubenswrapper[4766]: I1209 03:39:00.231791 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bv8lf" Dec 09 03:39:00 crc kubenswrapper[4766]: I1209 03:39:00.231878 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bv8lf" Dec 09 
03:39:00 crc kubenswrapper[4766]: I1209 03:39:00.311942 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bv8lf" Dec 09 03:39:01 crc kubenswrapper[4766]: I1209 03:39:01.099829 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bv8lf" Dec 09 03:39:01 crc kubenswrapper[4766]: I1209 03:39:01.469381 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bv8lf"] Dec 09 03:39:03 crc kubenswrapper[4766]: I1209 03:39:03.078142 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bv8lf" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerName="registry-server" containerID="cri-o://e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab" gracePeriod=2 Dec 09 03:39:03 crc kubenswrapper[4766]: I1209 03:39:03.981004 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bv8lf" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.087999 4766 generic.go:334] "Generic (PLEG): container finished" podID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerID="e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab" exitCode=0 Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.088048 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8lf" event={"ID":"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879","Type":"ContainerDied","Data":"e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab"} Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.088080 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv8lf" event={"ID":"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879","Type":"ContainerDied","Data":"a894e1dffd724e13270f3cf8cd0f24f479aa98d5544e6ca8ae26998d4d3d32e2"} Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.088100 4766 scope.go:117] "RemoveContainer" containerID="e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.088118 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bv8lf" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.106124 4766 scope.go:117] "RemoveContainer" containerID="cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.140161 4766 scope.go:117] "RemoveContainer" containerID="fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.147109 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-utilities\") pod \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.147256 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfr88\" (UniqueName: \"kubernetes.io/projected/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-kube-api-access-jfr88\") pod \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.147305 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-catalog-content\") pod \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\" (UID: \"0bfc6ad1-cbb1-4c18-ae39-812ac20a6879\") " Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.148333 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-utilities" (OuterVolumeSpecName: "utilities") pod "0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" (UID: "0bfc6ad1-cbb1-4c18-ae39-812ac20a6879"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.153522 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-kube-api-access-jfr88" (OuterVolumeSpecName: "kube-api-access-jfr88") pod "0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" (UID: "0bfc6ad1-cbb1-4c18-ae39-812ac20a6879"). InnerVolumeSpecName "kube-api-access-jfr88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.164800 4766 scope.go:117] "RemoveContainer" containerID="e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab" Dec 09 03:39:04 crc kubenswrapper[4766]: E1209 03:39:04.165374 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab\": container with ID starting with e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab not found: ID does not exist" containerID="e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.165422 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab"} err="failed to get container status \"e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab\": rpc error: code = NotFound desc = could not find container \"e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab\": container with ID starting with e9d33a4d7ddb767708540cdbf02fc76dab476d5d7321b9448a61a835d39358ab not found: ID does not exist" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.165451 4766 scope.go:117] "RemoveContainer" containerID="cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17" Dec 09 03:39:04 crc kubenswrapper[4766]: E1209 03:39:04.165852 
4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17\": container with ID starting with cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17 not found: ID does not exist" containerID="cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.165886 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17"} err="failed to get container status \"cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17\": rpc error: code = NotFound desc = could not find container \"cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17\": container with ID starting with cdbe4c6574216dd5e3d16717ceffe3b26bfd8a4c1d9170d0bdb888ed98bf0d17 not found: ID does not exist" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.165908 4766 scope.go:117] "RemoveContainer" containerID="fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a" Dec 09 03:39:04 crc kubenswrapper[4766]: E1209 03:39:04.166161 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a\": container with ID starting with fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a not found: ID does not exist" containerID="fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.166187 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a"} err="failed to get container status \"fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a\": rpc error: code = 
NotFound desc = could not find container \"fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a\": container with ID starting with fb8cfaf333bc5c949f3ca68afdd710381f5e2d2cc1ae88d8792c4f880142127a not found: ID does not exist" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.198115 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" (UID: "0bfc6ad1-cbb1-4c18-ae39-812ac20a6879"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.248546 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.248590 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfr88\" (UniqueName: \"kubernetes.io/projected/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-kube-api-access-jfr88\") on node \"crc\" DevicePath \"\"" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.248599 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.428400 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bv8lf"] Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.435271 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bv8lf"] Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.797648 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.798001 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.850486 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" path="/var/lib/kubelet/pods/0bfc6ad1-cbb1-4c18-ae39-812ac20a6879/volumes" Dec 09 03:39:04 crc kubenswrapper[4766]: I1209 03:39:04.851643 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:39:05 crc kubenswrapper[4766]: I1209 03:39:05.141612 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:39:06 crc kubenswrapper[4766]: I1209 03:39:06.864989 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9j55"] Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.116140 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q9j55" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerName="registry-server" containerID="cri-o://821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07" gracePeriod=2 Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.505353 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.615169 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-catalog-content\") pod \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.615351 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-utilities\") pod \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.615414 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjx6w\" (UniqueName: \"kubernetes.io/projected/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-kube-api-access-sjx6w\") pod \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\" (UID: \"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f\") " Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.616060 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-utilities" (OuterVolumeSpecName: "utilities") pod "e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" (UID: "e6e138fa-d9c1-41a6-80b2-f21dc0921f1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.620190 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-kube-api-access-sjx6w" (OuterVolumeSpecName: "kube-api-access-sjx6w") pod "e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" (UID: "e6e138fa-d9c1-41a6-80b2-f21dc0921f1f"). InnerVolumeSpecName "kube-api-access-sjx6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.683870 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" (UID: "e6e138fa-d9c1-41a6-80b2-f21dc0921f1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.717367 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.717405 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjx6w\" (UniqueName: \"kubernetes.io/projected/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-kube-api-access-sjx6w\") on node \"crc\" DevicePath \"\"" Dec 09 03:39:07 crc kubenswrapper[4766]: I1209 03:39:07.717419 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.128178 4766 generic.go:334] "Generic (PLEG): container finished" podID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerID="821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07" exitCode=0 Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.128241 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9j55" event={"ID":"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f","Type":"ContainerDied","Data":"821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07"} Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.128270 4766 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-q9j55" event={"ID":"e6e138fa-d9c1-41a6-80b2-f21dc0921f1f","Type":"ContainerDied","Data":"e7290de63f60aa1d14d0b482ca5ca9dec9a88e056f46f2edc167e02b29aac849"} Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.128290 4766 scope.go:117] "RemoveContainer" containerID="821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.128422 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9j55" Dec 09 03:39:08 crc kubenswrapper[4766]: W1209 03:39:08.144021 4766 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6e138fa_d9c1_41a6_80b2_f21dc0921f1f.slice/memory.max": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6e138fa_d9c1_41a6_80b2_f21dc0921f1f.slice/memory.max: no such device Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.149856 4766 scope.go:117] "RemoveContainer" containerID="bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.160610 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9j55"] Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.166145 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q9j55"] Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.186117 4766 scope.go:117] "RemoveContainer" containerID="db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff" Dec 09 03:39:08 crc kubenswrapper[4766]: E1209 03:39:08.194485 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6e138fa_d9c1_41a6_80b2_f21dc0921f1f.slice\": RecentStats: unable to 
find data in memory cache]" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.202646 4766 scope.go:117] "RemoveContainer" containerID="821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07" Dec 09 03:39:08 crc kubenswrapper[4766]: E1209 03:39:08.202950 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07\": container with ID starting with 821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07 not found: ID does not exist" containerID="821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.202981 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07"} err="failed to get container status \"821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07\": rpc error: code = NotFound desc = could not find container \"821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07\": container with ID starting with 821167e78c2827d500c125e66fe7bd30cab094475372f40276d2b33dcc74af07 not found: ID does not exist" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.203000 4766 scope.go:117] "RemoveContainer" containerID="bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b" Dec 09 03:39:08 crc kubenswrapper[4766]: E1209 03:39:08.203308 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b\": container with ID starting with bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b not found: ID does not exist" containerID="bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.203332 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b"} err="failed to get container status \"bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b\": rpc error: code = NotFound desc = could not find container \"bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b\": container with ID starting with bf1ae2e12fecb7a418335ce86000db7c213024095ae80c844953c986b5a0115b not found: ID does not exist" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.203344 4766 scope.go:117] "RemoveContainer" containerID="db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff" Dec 09 03:39:08 crc kubenswrapper[4766]: E1209 03:39:08.203534 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff\": container with ID starting with db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff not found: ID does not exist" containerID="db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.203555 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff"} err="failed to get container status \"db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff\": rpc error: code = NotFound desc = could not find container \"db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff\": container with ID starting with db54ca28957985434d6c4653c71577625d0cd7174be496f4fb568011331490ff not found: ID does not exist" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.844327 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:39:08 crc kubenswrapper[4766]: E1209 
03:39:08.844597 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:39:08 crc kubenswrapper[4766]: I1209 03:39:08.849486 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" path="/var/lib/kubelet/pods/e6e138fa-d9c1-41a6-80b2-f21dc0921f1f/volumes" Dec 09 03:39:23 crc kubenswrapper[4766]: I1209 03:39:23.839289 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:39:23 crc kubenswrapper[4766]: E1209 03:39:23.840111 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:39:38 crc kubenswrapper[4766]: I1209 03:39:38.844700 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:39:38 crc kubenswrapper[4766]: E1209 03:39:38.845521 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:39:52 crc kubenswrapper[4766]: I1209 03:39:52.840635 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:39:52 crc kubenswrapper[4766]: E1209 03:39:52.841552 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:40:05 crc kubenswrapper[4766]: I1209 03:40:05.840088 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:40:05 crc kubenswrapper[4766]: E1209 03:40:05.840921 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:40:09 crc kubenswrapper[4766]: I1209 03:40:09.815990 4766 scope.go:117] "RemoveContainer" containerID="7e3d89ca9056493b6e6e8ac9b33bcad91da4a6aaccec7f99bacb242855bb6e95" Dec 09 03:40:09 crc kubenswrapper[4766]: I1209 03:40:09.849979 4766 scope.go:117] "RemoveContainer" containerID="0b4f54ea8d642e05e2a555795cff1dc74694cf421659e3f43efd44714ff34125" Dec 09 03:40:09 crc kubenswrapper[4766]: I1209 03:40:09.894969 4766 scope.go:117] "RemoveContainer" containerID="797200790ce6d3aaa2d627d061ab0d04ee01f885e7da6724a44b88b836f80f26" Dec 09 03:40:09 crc kubenswrapper[4766]: I1209 03:40:09.942598 4766 
scope.go:117] "RemoveContainer" containerID="4f3f6e014efa4d6dfba0dea7aea4893fdb2153914420f7a8f8e5fd012360ef1f" Dec 09 03:40:09 crc kubenswrapper[4766]: I1209 03:40:09.968838 4766 scope.go:117] "RemoveContainer" containerID="2d3338e237ba7af89e198aa31cd1ddc3d0b533524dcf30eb8fa1547e1b318aaf" Dec 09 03:40:09 crc kubenswrapper[4766]: I1209 03:40:09.995983 4766 scope.go:117] "RemoveContainer" containerID="5e424dade2953e128931f7283ccbc4c6fecc4b4958d07c505678a5238649dfb3" Dec 09 03:40:10 crc kubenswrapper[4766]: I1209 03:40:10.022384 4766 scope.go:117] "RemoveContainer" containerID="c05a6f7c85a947116735fa7ade874110dcd927f792cdf1a18a3bb6036f564a48" Dec 09 03:40:10 crc kubenswrapper[4766]: I1209 03:40:10.049479 4766 scope.go:117] "RemoveContainer" containerID="df2580fc6a7c68c79a9fa23dde958abfe360c1d442a6396b5b94f67428f72619" Dec 09 03:40:10 crc kubenswrapper[4766]: I1209 03:40:10.066625 4766 scope.go:117] "RemoveContainer" containerID="703a57a4d0aff6c0b3a602487fcde7105c12acc5a93f46652d99af88e4758cbd" Dec 09 03:40:10 crc kubenswrapper[4766]: I1209 03:40:10.084091 4766 scope.go:117] "RemoveContainer" containerID="a5885e25c1b50574e74bf9082d1b22a74bc1bdd420d533adc98232798e6efdce" Dec 09 03:40:10 crc kubenswrapper[4766]: I1209 03:40:10.106697 4766 scope.go:117] "RemoveContainer" containerID="24729679fd0dcc450517c6906dbe60f8c99be4396dcda6ec1d41ff11e94c6452" Dec 09 03:40:10 crc kubenswrapper[4766]: I1209 03:40:10.126240 4766 scope.go:117] "RemoveContainer" containerID="d49afed99ab414e48e5466fbe45ed0135ebd1a69a605f4158a48d0568bf92d18" Dec 09 03:40:17 crc kubenswrapper[4766]: I1209 03:40:17.839964 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:40:17 crc kubenswrapper[4766]: E1209 03:40:17.840958 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:40:31 crc kubenswrapper[4766]: I1209 03:40:31.839665 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:40:31 crc kubenswrapper[4766]: E1209 03:40:31.840790 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:40:42 crc kubenswrapper[4766]: I1209 03:40:42.840161 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:40:42 crc kubenswrapper[4766]: E1209 03:40:42.842447 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:40:56 crc kubenswrapper[4766]: I1209 03:40:56.839707 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:40:56 crc kubenswrapper[4766]: E1209 03:40:56.840660 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.329873 4766 scope.go:117] "RemoveContainer" containerID="37efa13168c573c890903d3df4f307a2f1971d2d65054fb7d786360d3f24f242" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.364932 4766 scope.go:117] "RemoveContainer" containerID="0c25d2af5e2e7351b243503284d8e691c6a6faa53b211e11267bea67a92b5a4d" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.421815 4766 scope.go:117] "RemoveContainer" containerID="99952e0ef23e63c630103cbcb6fa7f6bfbf1f3297c7069ddbb9d529ac929de90" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.461290 4766 scope.go:117] "RemoveContainer" containerID="b06965f1ab86e1115e4f9ff8af07336b6f7146502c5e6422ba2fb8b5d1d20e96" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.477905 4766 scope.go:117] "RemoveContainer" containerID="ea2d873adb8b8f0bf3c7da1db055698c2ecd4add0b416c79010a80d08ed56eb9" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.519770 4766 scope.go:117] "RemoveContainer" containerID="aeb92723757cb91dd1871bac1a2f9acfdedb227f6bd6772c6fe5cd41ca358b51" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.541633 4766 scope.go:117] "RemoveContainer" containerID="32ecfe87a5271d6fd5dce2e9ea519ef68986440e73fe41076aca0515484e504d" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.588081 4766 scope.go:117] "RemoveContainer" containerID="2abec2b90461646367e56150d6bfc41963a577b60de43c6d0bc97c3b8d56a479" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.611343 4766 scope.go:117] "RemoveContainer" containerID="d10a4a20cfee2935e2676d326b1df6fbc314a11ef1613497c99fe595048642e5" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.660090 4766 scope.go:117] "RemoveContainer" 
containerID="b0a393cd167a7adea99e8e7fe9ff0ab043ea9c4d159ff79b0dade14d2474032d" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.690761 4766 scope.go:117] "RemoveContainer" containerID="8395f4aaedfdb13ba81601d8b8bed90b4f2057ed88097de8b79f3aa583caf210" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.709042 4766 scope.go:117] "RemoveContainer" containerID="72245693cd64770006d3527802bc4f15e68d9c06f27cb3116690cee7671021ae" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.736328 4766 scope.go:117] "RemoveContainer" containerID="1ea189ed90e2f9b071a383669d02f3c49ad21027bdd943cae8002f36da104948" Dec 09 03:41:10 crc kubenswrapper[4766]: I1209 03:41:10.839024 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:41:10 crc kubenswrapper[4766]: E1209 03:41:10.839320 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:41:21 crc kubenswrapper[4766]: I1209 03:41:21.839042 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:41:21 crc kubenswrapper[4766]: E1209 03:41:21.839872 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:41:33 crc 
kubenswrapper[4766]: I1209 03:41:33.839481 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:41:33 crc kubenswrapper[4766]: E1209 03:41:33.840186 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:41:45 crc kubenswrapper[4766]: I1209 03:41:45.839375 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:41:45 crc kubenswrapper[4766]: E1209 03:41:45.840396 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:42:00 crc kubenswrapper[4766]: I1209 03:42:00.839601 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:42:00 crc kubenswrapper[4766]: E1209 03:42:00.840448 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 
09 03:42:10 crc kubenswrapper[4766]: I1209 03:42:10.924604 4766 scope.go:117] "RemoveContainer" containerID="d9fab7dfeb3826cd0321cfd7d0feacdb6d86c47207cf4911ed9ec23c6780f0a1" Dec 09 03:42:10 crc kubenswrapper[4766]: I1209 03:42:10.986465 4766 scope.go:117] "RemoveContainer" containerID="1d46ce8d257b80d4a4e27654d912e2d51bed7b3e943ae0d9d0afac1cc916f758" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.014577 4766 scope.go:117] "RemoveContainer" containerID="31d4aef01e5d6494a763f4846694b59d4c729b3f2ea71927992000f13a953678" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.040095 4766 scope.go:117] "RemoveContainer" containerID="e25b6d52f84a5db4c92a5ad1a166d1f5cf47e474cd768f741c58bbd00f159311" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.073312 4766 scope.go:117] "RemoveContainer" containerID="3db5b2f3ce4dc714b8a30ad8ec2e3babbb0b659bc2980b720781766e884a614d" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.098839 4766 scope.go:117] "RemoveContainer" containerID="aa1b6bc2d208e72f0a55decbffc820ca0e6a6e6ddd531379c4f4b69be452c224" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.121780 4766 scope.go:117] "RemoveContainer" containerID="6a9c046486c1c5dab899d5d0aba1033437d6e7306b2ba8194f3216464695d8cc" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.140409 4766 scope.go:117] "RemoveContainer" containerID="fba4f4072456bf794d0f60bf93f26a45e302fb3f1e1abcf3524f28c79c148914" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.169569 4766 scope.go:117] "RemoveContainer" containerID="6344e7369a853e77c3393dd6d7339e8c504e823cd554068d45409f77989aba7d" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.198698 4766 scope.go:117] "RemoveContainer" containerID="1e19e8939ccf2b634db5afdb6bea4bb55aaae63d314fa98c05a6488b6ad75ea8" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.219490 4766 scope.go:117] "RemoveContainer" containerID="449d2dddaa0f97deefa2998a70a2009afc79446167e01a0e225a949bcd70b905" Dec 09 03:42:11 crc 
kubenswrapper[4766]: I1209 03:42:11.239994 4766 scope.go:117] "RemoveContainer" containerID="2e75fee47cbaa033a97f9c0a2fead9cf5720b793210e021d133929ac9e7853c5" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.256171 4766 scope.go:117] "RemoveContainer" containerID="1f7f05c852cc98494db502851dafc9d9e8f7eb88e225048ac88ebdb63d25b528" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.275291 4766 scope.go:117] "RemoveContainer" containerID="949ff15f71d458a94f6dfdcf0fdf4392ce78c50c4a716dd4dcce6614bb75c2ca" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.296800 4766 scope.go:117] "RemoveContainer" containerID="a76bb5c8ee791c900387340a2d3fc6fcb013d905d053c5740dcf188f5500cce9" Dec 09 03:42:11 crc kubenswrapper[4766]: I1209 03:42:11.839350 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:42:12 crc kubenswrapper[4766]: I1209 03:42:12.915465 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"1b8ea9e58118db65df0a143bc4e6610e1e42d0f82ff9e57737d9c5cdb61f8871"} Dec 09 03:43:11 crc kubenswrapper[4766]: I1209 03:43:11.520272 4766 scope.go:117] "RemoveContainer" containerID="ca2afbaa30f2f394604235f5720c5aa4e72406b67d1ca341aaee0ebf94179748" Dec 09 03:43:11 crc kubenswrapper[4766]: I1209 03:43:11.558578 4766 scope.go:117] "RemoveContainer" containerID="ce107d8559532857d9f216c112581bb9f83cc5d523eef80c33104385461d818c" Dec 09 03:43:11 crc kubenswrapper[4766]: I1209 03:43:11.596887 4766 scope.go:117] "RemoveContainer" containerID="c195a4bf15fe20f9423e97eb1ce60f9483c3da4af5de80348bebc92f5f6abde2" Dec 09 03:43:11 crc kubenswrapper[4766]: I1209 03:43:11.646661 4766 scope.go:117] "RemoveContainer" containerID="ce3ccf75f0cf375ecd11a04f0e8f5e4eaab059c87c848791d6110f0f4ae70604" Dec 09 03:43:11 crc kubenswrapper[4766]: I1209 03:43:11.670253 4766 
scope.go:117] "RemoveContainer" containerID="95f400ae3bc5fed8d76ed756e9cc0580f2a32e514a76972f4091c1cce5e175e5" Dec 09 03:44:11 crc kubenswrapper[4766]: I1209 03:44:11.800854 4766 scope.go:117] "RemoveContainer" containerID="63bf0756d28bad553080eb2bde20d0a5a0b4bb3782133960bfc5df7f50fdbbea" Dec 09 03:44:37 crc kubenswrapper[4766]: I1209 03:44:37.316823 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:44:37 crc kubenswrapper[4766]: I1209 03:44:37.317670 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.145850 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v"] Dec 09 03:45:00 crc kubenswrapper[4766]: E1209 03:45:00.146722 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerName="extract-utilities" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.146740 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerName="extract-utilities" Dec 09 03:45:00 crc kubenswrapper[4766]: E1209 03:45:00.146757 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerName="extract-content" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.146764 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" 
containerName="extract-content" Dec 09 03:45:00 crc kubenswrapper[4766]: E1209 03:45:00.146788 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerName="extract-content" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.146794 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerName="extract-content" Dec 09 03:45:00 crc kubenswrapper[4766]: E1209 03:45:00.146811 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerName="registry-server" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.146823 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerName="registry-server" Dec 09 03:45:00 crc kubenswrapper[4766]: E1209 03:45:00.146832 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerName="registry-server" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.146839 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" containerName="registry-server" Dec 09 03:45:00 crc kubenswrapper[4766]: E1209 03:45:00.146852 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerName="extract-utilities" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.146859 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerName="extract-utilities" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.147037 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bfc6ad1-cbb1-4c18-ae39-812ac20a6879" containerName="registry-server" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.147062 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e138fa-d9c1-41a6-80b2-f21dc0921f1f" 
containerName="registry-server" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.147742 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.151240 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.151496 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.159259 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v"] Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.230043 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/423a5f21-244b-4266-8d32-29468a4bf6f3-config-volume\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.230102 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/423a5f21-244b-4266-8d32-29468a4bf6f3-secret-volume\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.230291 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hzz7\" (UniqueName: 
\"kubernetes.io/projected/423a5f21-244b-4266-8d32-29468a4bf6f3-kube-api-access-4hzz7\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.332010 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hzz7\" (UniqueName: \"kubernetes.io/projected/423a5f21-244b-4266-8d32-29468a4bf6f3-kube-api-access-4hzz7\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.332125 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/423a5f21-244b-4266-8d32-29468a4bf6f3-config-volume\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.332179 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/423a5f21-244b-4266-8d32-29468a4bf6f3-secret-volume\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.333353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/423a5f21-244b-4266-8d32-29468a4bf6f3-config-volume\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 
03:45:00.337841 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/423a5f21-244b-4266-8d32-29468a4bf6f3-secret-volume\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.353298 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hzz7\" (UniqueName: \"kubernetes.io/projected/423a5f21-244b-4266-8d32-29468a4bf6f3-kube-api-access-4hzz7\") pod \"collect-profiles-29420865-7959v\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.472202 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:00 crc kubenswrapper[4766]: I1209 03:45:00.918418 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v"] Dec 09 03:45:00 crc kubenswrapper[4766]: W1209 03:45:00.927587 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423a5f21_244b_4266_8d32_29468a4bf6f3.slice/crio-7f81aaaa5c6967ee1d8e8c28d1f7e93cf07f55ec2d0575dd817b64d27364bd24 WatchSource:0}: Error finding container 7f81aaaa5c6967ee1d8e8c28d1f7e93cf07f55ec2d0575dd817b64d27364bd24: Status 404 returned error can't find the container with id 7f81aaaa5c6967ee1d8e8c28d1f7e93cf07f55ec2d0575dd817b64d27364bd24 Dec 09 03:45:01 crc kubenswrapper[4766]: I1209 03:45:01.415492 4766 generic.go:334] "Generic (PLEG): container finished" podID="423a5f21-244b-4266-8d32-29468a4bf6f3" containerID="0000d767f3cf311aa03fa518b2584004cedcba666a2aaf2b35eb2ba28afeabe6" exitCode=0 Dec 09 
03:45:01 crc kubenswrapper[4766]: I1209 03:45:01.415549 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" event={"ID":"423a5f21-244b-4266-8d32-29468a4bf6f3","Type":"ContainerDied","Data":"0000d767f3cf311aa03fa518b2584004cedcba666a2aaf2b35eb2ba28afeabe6"} Dec 09 03:45:01 crc kubenswrapper[4766]: I1209 03:45:01.415580 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" event={"ID":"423a5f21-244b-4266-8d32-29468a4bf6f3","Type":"ContainerStarted","Data":"7f81aaaa5c6967ee1d8e8c28d1f7e93cf07f55ec2d0575dd817b64d27364bd24"} Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.713526 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.784962 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hzz7\" (UniqueName: \"kubernetes.io/projected/423a5f21-244b-4266-8d32-29468a4bf6f3-kube-api-access-4hzz7\") pod \"423a5f21-244b-4266-8d32-29468a4bf6f3\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.785049 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/423a5f21-244b-4266-8d32-29468a4bf6f3-config-volume\") pod \"423a5f21-244b-4266-8d32-29468a4bf6f3\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.785079 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/423a5f21-244b-4266-8d32-29468a4bf6f3-secret-volume\") pod \"423a5f21-244b-4266-8d32-29468a4bf6f3\" (UID: \"423a5f21-244b-4266-8d32-29468a4bf6f3\") " Dec 09 03:45:02 crc 
kubenswrapper[4766]: I1209 03:45:02.785843 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/423a5f21-244b-4266-8d32-29468a4bf6f3-config-volume" (OuterVolumeSpecName: "config-volume") pod "423a5f21-244b-4266-8d32-29468a4bf6f3" (UID: "423a5f21-244b-4266-8d32-29468a4bf6f3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.792041 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423a5f21-244b-4266-8d32-29468a4bf6f3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "423a5f21-244b-4266-8d32-29468a4bf6f3" (UID: "423a5f21-244b-4266-8d32-29468a4bf6f3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.792939 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423a5f21-244b-4266-8d32-29468a4bf6f3-kube-api-access-4hzz7" (OuterVolumeSpecName: "kube-api-access-4hzz7") pod "423a5f21-244b-4266-8d32-29468a4bf6f3" (UID: "423a5f21-244b-4266-8d32-29468a4bf6f3"). InnerVolumeSpecName "kube-api-access-4hzz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.886862 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hzz7\" (UniqueName: \"kubernetes.io/projected/423a5f21-244b-4266-8d32-29468a4bf6f3-kube-api-access-4hzz7\") on node \"crc\" DevicePath \"\"" Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.886895 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/423a5f21-244b-4266-8d32-29468a4bf6f3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:45:02 crc kubenswrapper[4766]: I1209 03:45:02.886903 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/423a5f21-244b-4266-8d32-29468a4bf6f3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 03:45:03 crc kubenswrapper[4766]: I1209 03:45:03.435915 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" event={"ID":"423a5f21-244b-4266-8d32-29468a4bf6f3","Type":"ContainerDied","Data":"7f81aaaa5c6967ee1d8e8c28d1f7e93cf07f55ec2d0575dd817b64d27364bd24"} Dec 09 03:45:03 crc kubenswrapper[4766]: I1209 03:45:03.435969 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f81aaaa5c6967ee1d8e8c28d1f7e93cf07f55ec2d0575dd817b64d27364bd24" Dec 09 03:45:03 crc kubenswrapper[4766]: I1209 03:45:03.436013 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v" Dec 09 03:45:03 crc kubenswrapper[4766]: I1209 03:45:03.795728 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl"] Dec 09 03:45:03 crc kubenswrapper[4766]: I1209 03:45:03.803206 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420820-4thjl"] Dec 09 03:45:04 crc kubenswrapper[4766]: I1209 03:45:04.846466 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57e3485-ffb5-46c1-b8d6-2e0296b85fab" path="/var/lib/kubelet/pods/b57e3485-ffb5-46c1-b8d6-2e0296b85fab/volumes" Dec 09 03:45:07 crc kubenswrapper[4766]: I1209 03:45:07.317883 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:45:07 crc kubenswrapper[4766]: I1209 03:45:07.318183 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:45:11 crc kubenswrapper[4766]: I1209 03:45:11.891897 4766 scope.go:117] "RemoveContainer" containerID="703d5711eb0bd64eccef7cf273f21308b8e67a05b5225fbf8f645fe8cc618040" Dec 09 03:45:11 crc kubenswrapper[4766]: I1209 03:45:11.928178 4766 scope.go:117] "RemoveContainer" containerID="25a975fcaebea6149ff93cc7ae0f516d54a7143da50c55f98fc5ae61d5ecb604" Dec 09 03:45:11 crc kubenswrapper[4766]: I1209 03:45:11.981303 4766 scope.go:117] "RemoveContainer" 
containerID="0676e56c92cfee8a1d0d596375582bc5791b8e314503bd6233f9da366126d99f" Dec 09 03:45:12 crc kubenswrapper[4766]: I1209 03:45:12.023279 4766 scope.go:117] "RemoveContainer" containerID="ef3231ca6d75883603382318582483a2a49c6400d230ad1c3f2e0b1d23a36876" Dec 09 03:45:12 crc kubenswrapper[4766]: I1209 03:45:12.048472 4766 scope.go:117] "RemoveContainer" containerID="7a3e415d4be0c928fbd52e3bfdb8da0e258cbf44b903072d268634c023795eab" Dec 09 03:45:12 crc kubenswrapper[4766]: I1209 03:45:12.075751 4766 scope.go:117] "RemoveContainer" containerID="f3dc3e9efda4eb76a73048732b0e5928cb31c1081be17736b6813a8bd5fb7266" Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.316350 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.316988 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.317054 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.317815 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b8ea9e58118db65df0a143bc4e6610e1e42d0f82ff9e57737d9c5cdb61f8871"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.317911 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://1b8ea9e58118db65df0a143bc4e6610e1e42d0f82ff9e57737d9c5cdb61f8871" gracePeriod=600 Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.724995 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="1b8ea9e58118db65df0a143bc4e6610e1e42d0f82ff9e57737d9c5cdb61f8871" exitCode=0 Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.725081 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"1b8ea9e58118db65df0a143bc4e6610e1e42d0f82ff9e57737d9c5cdb61f8871"} Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.725438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825"} Dec 09 03:45:37 crc kubenswrapper[4766]: I1209 03:45:37.725466 4766 scope.go:117] "RemoveContainer" containerID="9e3db5ea650a27d84a0c2a93c5b031b734fcf06bd316c8f7cd3574a6083b9917" Dec 09 03:47:37 crc kubenswrapper[4766]: I1209 03:47:37.316303 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:47:37 crc kubenswrapper[4766]: I1209 03:47:37.316776 4766 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:48:07 crc kubenswrapper[4766]: I1209 03:48:07.317334 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:48:07 crc kubenswrapper[4766]: I1209 03:48:07.317958 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:48:37 crc kubenswrapper[4766]: I1209 03:48:37.316534 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:48:37 crc kubenswrapper[4766]: I1209 03:48:37.317272 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:48:37 crc kubenswrapper[4766]: I1209 03:48:37.317349 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:48:37 crc 
kubenswrapper[4766]: I1209 03:48:37.318441 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:48:37 crc kubenswrapper[4766]: I1209 03:48:37.318551 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" gracePeriod=600 Dec 09 03:48:37 crc kubenswrapper[4766]: E1209 03:48:37.449536 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:48:37 crc kubenswrapper[4766]: I1209 03:48:37.602985 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" exitCode=0 Dec 09 03:48:37 crc kubenswrapper[4766]: I1209 03:48:37.603032 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825"} Dec 09 03:48:37 crc kubenswrapper[4766]: I1209 03:48:37.603074 4766 scope.go:117] "RemoveContainer" 
containerID="1b8ea9e58118db65df0a143bc4e6610e1e42d0f82ff9e57737d9c5cdb61f8871" Dec 09 03:48:37 crc kubenswrapper[4766]: I1209 03:48:37.604181 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:48:37 crc kubenswrapper[4766]: E1209 03:48:37.604832 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.317203 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-88gv6"] Dec 09 03:48:45 crc kubenswrapper[4766]: E1209 03:48:45.318103 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423a5f21-244b-4266-8d32-29468a4bf6f3" containerName="collect-profiles" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.318118 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="423a5f21-244b-4266-8d32-29468a4bf6f3" containerName="collect-profiles" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.318326 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="423a5f21-244b-4266-8d32-29468a4bf6f3" containerName="collect-profiles" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.319639 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.331987 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-88gv6"] Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.458453 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms2cp\" (UniqueName: \"kubernetes.io/projected/3d1221ec-c739-4ff7-8672-3e3428992e4a-kube-api-access-ms2cp\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.458536 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-catalog-content\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.458554 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-utilities\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.560286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-catalog-content\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.560335 4766 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-utilities\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.560417 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms2cp\" (UniqueName: \"kubernetes.io/projected/3d1221ec-c739-4ff7-8672-3e3428992e4a-kube-api-access-ms2cp\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.560864 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-utilities\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.561013 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-catalog-content\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.582629 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms2cp\" (UniqueName: \"kubernetes.io/projected/3d1221ec-c739-4ff7-8672-3e3428992e4a-kube-api-access-ms2cp\") pod \"redhat-operators-88gv6\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:45 crc kubenswrapper[4766]: I1209 03:48:45.639537 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:46 crc kubenswrapper[4766]: I1209 03:48:46.100472 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-88gv6"] Dec 09 03:48:46 crc kubenswrapper[4766]: I1209 03:48:46.673802 4766 generic.go:334] "Generic (PLEG): container finished" podID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerID="013daf0a9a8c8a821d4380dd80853efc25235427bead348be6cfd0a41197839e" exitCode=0 Dec 09 03:48:46 crc kubenswrapper[4766]: I1209 03:48:46.673854 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-88gv6" event={"ID":"3d1221ec-c739-4ff7-8672-3e3428992e4a","Type":"ContainerDied","Data":"013daf0a9a8c8a821d4380dd80853efc25235427bead348be6cfd0a41197839e"} Dec 09 03:48:46 crc kubenswrapper[4766]: I1209 03:48:46.674117 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-88gv6" event={"ID":"3d1221ec-c739-4ff7-8672-3e3428992e4a","Type":"ContainerStarted","Data":"00d3eedb2f4f43ba525c217177ff54b4eeec912b442082e55eb1cc9597b1ee1a"} Dec 09 03:48:46 crc kubenswrapper[4766]: I1209 03:48:46.675253 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 03:48:47 crc kubenswrapper[4766]: I1209 03:48:47.688737 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-88gv6" event={"ID":"3d1221ec-c739-4ff7-8672-3e3428992e4a","Type":"ContainerStarted","Data":"e97252e7ec9bf28b42bee87bb82bdd8f34d7df92a65ffcff8691c70121482302"} Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.308671 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4qmtd"] Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.312023 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.331265 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qmtd"] Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.409495 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-catalog-content\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.409618 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m87lh\" (UniqueName: \"kubernetes.io/projected/ba2ffbca-81c2-41d6-b743-a48b98d408a5-kube-api-access-m87lh\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.409692 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-utilities\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.511598 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-catalog-content\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.511680 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m87lh\" (UniqueName: \"kubernetes.io/projected/ba2ffbca-81c2-41d6-b743-a48b98d408a5-kube-api-access-m87lh\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.511730 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-utilities\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.512200 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-catalog-content\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.512316 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-utilities\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.530120 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m87lh\" (UniqueName: \"kubernetes.io/projected/ba2ffbca-81c2-41d6-b743-a48b98d408a5-kube-api-access-m87lh\") pod \"redhat-marketplace-4qmtd\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.640558 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.697261 4766 generic.go:334] "Generic (PLEG): container finished" podID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerID="e97252e7ec9bf28b42bee87bb82bdd8f34d7df92a65ffcff8691c70121482302" exitCode=0 Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.697305 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-88gv6" event={"ID":"3d1221ec-c739-4ff7-8672-3e3428992e4a","Type":"ContainerDied","Data":"e97252e7ec9bf28b42bee87bb82bdd8f34d7df92a65ffcff8691c70121482302"} Dec 09 03:48:48 crc kubenswrapper[4766]: I1209 03:48:48.977553 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qmtd"] Dec 09 03:48:49 crc kubenswrapper[4766]: I1209 03:48:49.706640 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerID="8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae" exitCode=0 Dec 09 03:48:49 crc kubenswrapper[4766]: I1209 03:48:49.706844 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qmtd" event={"ID":"ba2ffbca-81c2-41d6-b743-a48b98d408a5","Type":"ContainerDied","Data":"8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae"} Dec 09 03:48:49 crc kubenswrapper[4766]: I1209 03:48:49.707120 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qmtd" event={"ID":"ba2ffbca-81c2-41d6-b743-a48b98d408a5","Type":"ContainerStarted","Data":"f408a8afaad68c41d4746350596d689b4346ba04f3ee9e9e1afde6617e4f1c97"} Dec 09 03:48:49 crc kubenswrapper[4766]: I1209 03:48:49.711517 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-88gv6" 
event={"ID":"3d1221ec-c739-4ff7-8672-3e3428992e4a","Type":"ContainerStarted","Data":"bda7323d1fa9341fbba6d7a09b2c6002a8c00c57f5927f1417ab0feecc1ba0c5"} Dec 09 03:48:49 crc kubenswrapper[4766]: I1209 03:48:49.749707 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-88gv6" podStartSLOduration=2.319173747 podStartE2EDuration="4.749685661s" podCreationTimestamp="2025-12-09 03:48:45 +0000 UTC" firstStartedPulling="2025-12-09 03:48:46.675000958 +0000 UTC m=+2208.384306394" lastFinishedPulling="2025-12-09 03:48:49.105512882 +0000 UTC m=+2210.814818308" observedRunningTime="2025-12-09 03:48:49.746864285 +0000 UTC m=+2211.456169721" watchObservedRunningTime="2025-12-09 03:48:49.749685661 +0000 UTC m=+2211.458991087" Dec 09 03:48:50 crc kubenswrapper[4766]: I1209 03:48:50.720097 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerID="910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580" exitCode=0 Dec 09 03:48:50 crc kubenswrapper[4766]: I1209 03:48:50.720172 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qmtd" event={"ID":"ba2ffbca-81c2-41d6-b743-a48b98d408a5","Type":"ContainerDied","Data":"910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580"} Dec 09 03:48:51 crc kubenswrapper[4766]: I1209 03:48:51.727913 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qmtd" event={"ID":"ba2ffbca-81c2-41d6-b743-a48b98d408a5","Type":"ContainerStarted","Data":"8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9"} Dec 09 03:48:51 crc kubenswrapper[4766]: I1209 03:48:51.747628 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4qmtd" podStartSLOduration=2.34380826 podStartE2EDuration="3.747604813s" podCreationTimestamp="2025-12-09 03:48:48 +0000 UTC" 
firstStartedPulling="2025-12-09 03:48:49.708367806 +0000 UTC m=+2211.417673272" lastFinishedPulling="2025-12-09 03:48:51.112164349 +0000 UTC m=+2212.821469825" observedRunningTime="2025-12-09 03:48:51.743511682 +0000 UTC m=+2213.452817118" watchObservedRunningTime="2025-12-09 03:48:51.747604813 +0000 UTC m=+2213.456910239" Dec 09 03:48:51 crc kubenswrapper[4766]: I1209 03:48:51.838842 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:48:51 crc kubenswrapper[4766]: E1209 03:48:51.839069 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:48:55 crc kubenswrapper[4766]: I1209 03:48:55.640759 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:55 crc kubenswrapper[4766]: I1209 03:48:55.641282 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:55 crc kubenswrapper[4766]: I1209 03:48:55.703586 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:55 crc kubenswrapper[4766]: I1209 03:48:55.798279 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:48:56 crc kubenswrapper[4766]: I1209 03:48:56.492763 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-88gv6"] Dec 09 03:48:57 crc kubenswrapper[4766]: I1209 03:48:57.774785 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-88gv6" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerName="registry-server" containerID="cri-o://bda7323d1fa9341fbba6d7a09b2c6002a8c00c57f5927f1417ab0feecc1ba0c5" gracePeriod=2 Dec 09 03:48:58 crc kubenswrapper[4766]: I1209 03:48:58.641669 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:58 crc kubenswrapper[4766]: I1209 03:48:58.642190 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:58 crc kubenswrapper[4766]: I1209 03:48:58.695713 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:58 crc kubenswrapper[4766]: I1209 03:48:58.821547 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:48:59 crc kubenswrapper[4766]: I1209 03:48:59.791659 4766 generic.go:334] "Generic (PLEG): container finished" podID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerID="bda7323d1fa9341fbba6d7a09b2c6002a8c00c57f5927f1417ab0feecc1ba0c5" exitCode=0 Dec 09 03:48:59 crc kubenswrapper[4766]: I1209 03:48:59.791742 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-88gv6" event={"ID":"3d1221ec-c739-4ff7-8672-3e3428992e4a","Type":"ContainerDied","Data":"bda7323d1fa9341fbba6d7a09b2c6002a8c00c57f5927f1417ab0feecc1ba0c5"} Dec 09 03:48:59 crc kubenswrapper[4766]: I1209 03:48:59.886929 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qmtd"] Dec 09 03:49:00 crc kubenswrapper[4766]: I1209 03:49:00.799104 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4qmtd" 
podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerName="registry-server" containerID="cri-o://8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9" gracePeriod=2 Dec 09 03:49:00 crc kubenswrapper[4766]: I1209 03:49:00.878466 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.016830 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-catalog-content\") pod \"3d1221ec-c739-4ff7-8672-3e3428992e4a\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.016917 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms2cp\" (UniqueName: \"kubernetes.io/projected/3d1221ec-c739-4ff7-8672-3e3428992e4a-kube-api-access-ms2cp\") pod \"3d1221ec-c739-4ff7-8672-3e3428992e4a\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.016993 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-utilities\") pod \"3d1221ec-c739-4ff7-8672-3e3428992e4a\" (UID: \"3d1221ec-c739-4ff7-8672-3e3428992e4a\") " Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.018461 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-utilities" (OuterVolumeSpecName: "utilities") pod "3d1221ec-c739-4ff7-8672-3e3428992e4a" (UID: "3d1221ec-c739-4ff7-8672-3e3428992e4a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.027412 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1221ec-c739-4ff7-8672-3e3428992e4a-kube-api-access-ms2cp" (OuterVolumeSpecName: "kube-api-access-ms2cp") pod "3d1221ec-c739-4ff7-8672-3e3428992e4a" (UID: "3d1221ec-c739-4ff7-8672-3e3428992e4a"). InnerVolumeSpecName "kube-api-access-ms2cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.118917 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms2cp\" (UniqueName: \"kubernetes.io/projected/3d1221ec-c739-4ff7-8672-3e3428992e4a-kube-api-access-ms2cp\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.118953 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.153930 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d1221ec-c739-4ff7-8672-3e3428992e4a" (UID: "3d1221ec-c739-4ff7-8672-3e3428992e4a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.167959 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.220035 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m87lh\" (UniqueName: \"kubernetes.io/projected/ba2ffbca-81c2-41d6-b743-a48b98d408a5-kube-api-access-m87lh\") pod \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.220765 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-utilities\") pod \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.220835 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-catalog-content\") pod \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\" (UID: \"ba2ffbca-81c2-41d6-b743-a48b98d408a5\") " Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.221760 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-utilities" (OuterVolumeSpecName: "utilities") pod "ba2ffbca-81c2-41d6-b743-a48b98d408a5" (UID: "ba2ffbca-81c2-41d6-b743-a48b98d408a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.225137 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2ffbca-81c2-41d6-b743-a48b98d408a5-kube-api-access-m87lh" (OuterVolumeSpecName: "kube-api-access-m87lh") pod "ba2ffbca-81c2-41d6-b743-a48b98d408a5" (UID: "ba2ffbca-81c2-41d6-b743-a48b98d408a5"). InnerVolumeSpecName "kube-api-access-m87lh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.226968 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d1221ec-c739-4ff7-8672-3e3428992e4a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.227005 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m87lh\" (UniqueName: \"kubernetes.io/projected/ba2ffbca-81c2-41d6-b743-a48b98d408a5-kube-api-access-m87lh\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.227021 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.243379 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba2ffbca-81c2-41d6-b743-a48b98d408a5" (UID: "ba2ffbca-81c2-41d6-b743-a48b98d408a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.328423 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2ffbca-81c2-41d6-b743-a48b98d408a5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.808081 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-88gv6" event={"ID":"3d1221ec-c739-4ff7-8672-3e3428992e4a","Type":"ContainerDied","Data":"00d3eedb2f4f43ba525c217177ff54b4eeec912b442082e55eb1cc9597b1ee1a"} Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.808143 4766 scope.go:117] "RemoveContainer" containerID="bda7323d1fa9341fbba6d7a09b2c6002a8c00c57f5927f1417ab0feecc1ba0c5" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.808102 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-88gv6" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.812327 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerID="8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9" exitCode=0 Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.812379 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qmtd" event={"ID":"ba2ffbca-81c2-41d6-b743-a48b98d408a5","Type":"ContainerDied","Data":"8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9"} Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.812404 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qmtd" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.812423 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qmtd" event={"ID":"ba2ffbca-81c2-41d6-b743-a48b98d408a5","Type":"ContainerDied","Data":"f408a8afaad68c41d4746350596d689b4346ba04f3ee9e9e1afde6617e4f1c97"} Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.860034 4766 scope.go:117] "RemoveContainer" containerID="e97252e7ec9bf28b42bee87bb82bdd8f34d7df92a65ffcff8691c70121482302" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.866113 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qmtd"] Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.881357 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qmtd"] Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.884687 4766 scope.go:117] "RemoveContainer" containerID="013daf0a9a8c8a821d4380dd80853efc25235427bead348be6cfd0a41197839e" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.886369 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-88gv6"] Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.890703 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-88gv6"] Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.905540 4766 scope.go:117] "RemoveContainer" containerID="8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.925707 4766 scope.go:117] "RemoveContainer" containerID="910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.938753 4766 scope.go:117] "RemoveContainer" containerID="8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae" Dec 09 03:49:01 crc 
kubenswrapper[4766]: I1209 03:49:01.970328 4766 scope.go:117] "RemoveContainer" containerID="8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9" Dec 09 03:49:01 crc kubenswrapper[4766]: E1209 03:49:01.970772 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9\": container with ID starting with 8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9 not found: ID does not exist" containerID="8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.970841 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9"} err="failed to get container status \"8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9\": rpc error: code = NotFound desc = could not find container \"8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9\": container with ID starting with 8ed31b9cb49756cef2dc0e0a11baca5cac9456ee00bce446f6a8f57ad26a11c9 not found: ID does not exist" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.970886 4766 scope.go:117] "RemoveContainer" containerID="910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580" Dec 09 03:49:01 crc kubenswrapper[4766]: E1209 03:49:01.971982 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580\": container with ID starting with 910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580 not found: ID does not exist" containerID="910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.972039 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580"} err="failed to get container status \"910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580\": rpc error: code = NotFound desc = could not find container \"910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580\": container with ID starting with 910f0e42691e913ceeda07b1d115bda161fa0afcd9d9e2adad74bc29b63e3580 not found: ID does not exist" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.972076 4766 scope.go:117] "RemoveContainer" containerID="8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae" Dec 09 03:49:01 crc kubenswrapper[4766]: E1209 03:49:01.972811 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae\": container with ID starting with 8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae not found: ID does not exist" containerID="8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae" Dec 09 03:49:01 crc kubenswrapper[4766]: I1209 03:49:01.972862 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae"} err="failed to get container status \"8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae\": rpc error: code = NotFound desc = could not find container \"8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae\": container with ID starting with 8732c22df93caf812345089d229db8e02e8b2ce6ebd4e938c95567685d2103ae not found: ID does not exist" Dec 09 03:49:02 crc kubenswrapper[4766]: I1209 03:49:02.840056 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:49:02 crc kubenswrapper[4766]: E1209 03:49:02.840427 4766 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:49:02 crc kubenswrapper[4766]: I1209 03:49:02.850334 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" path="/var/lib/kubelet/pods/3d1221ec-c739-4ff7-8672-3e3428992e4a/volumes" Dec 09 03:49:02 crc kubenswrapper[4766]: I1209 03:49:02.851681 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" path="/var/lib/kubelet/pods/ba2ffbca-81c2-41d6-b743-a48b98d408a5/volumes" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.299956 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k8lmw"] Dec 09 03:49:05 crc kubenswrapper[4766]: E1209 03:49:05.300589 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerName="extract-utilities" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.300603 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerName="extract-utilities" Dec 09 03:49:05 crc kubenswrapper[4766]: E1209 03:49:05.300622 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerName="registry-server" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.300630 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerName="registry-server" Dec 09 03:49:05 crc kubenswrapper[4766]: E1209 03:49:05.300647 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerName="registry-server" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.300655 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerName="registry-server" Dec 09 03:49:05 crc kubenswrapper[4766]: E1209 03:49:05.300676 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerName="extract-content" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.300684 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerName="extract-content" Dec 09 03:49:05 crc kubenswrapper[4766]: E1209 03:49:05.300696 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerName="extract-utilities" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.300704 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerName="extract-utilities" Dec 09 03:49:05 crc kubenswrapper[4766]: E1209 03:49:05.300719 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerName="extract-content" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.300727 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerName="extract-content" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.300871 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2ffbca-81c2-41d6-b743-a48b98d408a5" containerName="registry-server" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.300898 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1221ec-c739-4ff7-8672-3e3428992e4a" containerName="registry-server" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.302147 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.324387 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8lmw"] Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.390074 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-catalog-content\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.390156 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r7t7\" (UniqueName: \"kubernetes.io/projected/1755e6ca-cd09-4b44-9aa6-bf29789c9377-kube-api-access-9r7t7\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.390507 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-utilities\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.491729 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-catalog-content\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.491799 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9r7t7\" (UniqueName: \"kubernetes.io/projected/1755e6ca-cd09-4b44-9aa6-bf29789c9377-kube-api-access-9r7t7\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.491887 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-utilities\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.492397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-catalog-content\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.492429 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-utilities\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.516354 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r7t7\" (UniqueName: \"kubernetes.io/projected/1755e6ca-cd09-4b44-9aa6-bf29789c9377-kube-api-access-9r7t7\") pod \"community-operators-k8lmw\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:05 crc kubenswrapper[4766]: I1209 03:49:05.674609 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.147759 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8lmw"] Dec 09 03:49:06 crc kubenswrapper[4766]: W1209 03:49:06.154709 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1755e6ca_cd09_4b44_9aa6_bf29789c9377.slice/crio-1e467c2859c3790aef2356c80d9f94ceee2dacbfa9e26f7625bae99867c57c7e WatchSource:0}: Error finding container 1e467c2859c3790aef2356c80d9f94ceee2dacbfa9e26f7625bae99867c57c7e: Status 404 returned error can't find the container with id 1e467c2859c3790aef2356c80d9f94ceee2dacbfa9e26f7625bae99867c57c7e Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.322837 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-264gd"] Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.326703 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.333821 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-264gd"] Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.408617 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-catalog-content\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.408681 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-utilities\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.408740 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkgz8\" (UniqueName: \"kubernetes.io/projected/ca21492d-3963-436f-9b85-91169c8e91be-kube-api-access-lkgz8\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.510655 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-catalog-content\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.510710 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-utilities\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.510756 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkgz8\" (UniqueName: \"kubernetes.io/projected/ca21492d-3963-436f-9b85-91169c8e91be-kube-api-access-lkgz8\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.511237 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-catalog-content\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.511357 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-utilities\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.528765 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkgz8\" (UniqueName: \"kubernetes.io/projected/ca21492d-3963-436f-9b85-91169c8e91be-kube-api-access-lkgz8\") pod \"certified-operators-264gd\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.645429 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.853863 4766 generic.go:334] "Generic (PLEG): container finished" podID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerID="331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603" exitCode=0 Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.853921 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8lmw" event={"ID":"1755e6ca-cd09-4b44-9aa6-bf29789c9377","Type":"ContainerDied","Data":"331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603"} Dec 09 03:49:06 crc kubenswrapper[4766]: I1209 03:49:06.854106 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8lmw" event={"ID":"1755e6ca-cd09-4b44-9aa6-bf29789c9377","Type":"ContainerStarted","Data":"1e467c2859c3790aef2356c80d9f94ceee2dacbfa9e26f7625bae99867c57c7e"} Dec 09 03:49:07 crc kubenswrapper[4766]: I1209 03:49:07.080791 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-264gd"] Dec 09 03:49:07 crc kubenswrapper[4766]: W1209 03:49:07.089356 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca21492d_3963_436f_9b85_91169c8e91be.slice/crio-1960090d08d5b6846c23c324a1c7769789882ef28bafb8898632a88da2ed2e44 WatchSource:0}: Error finding container 1960090d08d5b6846c23c324a1c7769789882ef28bafb8898632a88da2ed2e44: Status 404 returned error can't find the container with id 1960090d08d5b6846c23c324a1c7769789882ef28bafb8898632a88da2ed2e44 Dec 09 03:49:07 crc kubenswrapper[4766]: I1209 03:49:07.868126 4766 generic.go:334] "Generic (PLEG): container finished" podID="ca21492d-3963-436f-9b85-91169c8e91be" containerID="4a67f029219963e91514d9b5fecb356b00269a7bb708b9eb73541229075c2dd5" exitCode=0 Dec 09 03:49:07 crc kubenswrapper[4766]: I1209 
03:49:07.868454 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-264gd" event={"ID":"ca21492d-3963-436f-9b85-91169c8e91be","Type":"ContainerDied","Data":"4a67f029219963e91514d9b5fecb356b00269a7bb708b9eb73541229075c2dd5"} Dec 09 03:49:07 crc kubenswrapper[4766]: I1209 03:49:07.868775 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-264gd" event={"ID":"ca21492d-3963-436f-9b85-91169c8e91be","Type":"ContainerStarted","Data":"1960090d08d5b6846c23c324a1c7769789882ef28bafb8898632a88da2ed2e44"} Dec 09 03:49:07 crc kubenswrapper[4766]: I1209 03:49:07.872652 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8lmw" event={"ID":"1755e6ca-cd09-4b44-9aa6-bf29789c9377","Type":"ContainerStarted","Data":"356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b"} Dec 09 03:49:08 crc kubenswrapper[4766]: I1209 03:49:08.887897 4766 generic.go:334] "Generic (PLEG): container finished" podID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerID="356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b" exitCode=0 Dec 09 03:49:08 crc kubenswrapper[4766]: I1209 03:49:08.887949 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8lmw" event={"ID":"1755e6ca-cd09-4b44-9aa6-bf29789c9377","Type":"ContainerDied","Data":"356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b"} Dec 09 03:49:09 crc kubenswrapper[4766]: I1209 03:49:09.899497 4766 generic.go:334] "Generic (PLEG): container finished" podID="ca21492d-3963-436f-9b85-91169c8e91be" containerID="c1a115117a838714d53cefb39ce6426b89bc60e9a6b61a2189661d7216dc2fbf" exitCode=0 Dec 09 03:49:09 crc kubenswrapper[4766]: I1209 03:49:09.899625 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-264gd" 
event={"ID":"ca21492d-3963-436f-9b85-91169c8e91be","Type":"ContainerDied","Data":"c1a115117a838714d53cefb39ce6426b89bc60e9a6b61a2189661d7216dc2fbf"} Dec 09 03:49:09 crc kubenswrapper[4766]: I1209 03:49:09.903470 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8lmw" event={"ID":"1755e6ca-cd09-4b44-9aa6-bf29789c9377","Type":"ContainerStarted","Data":"53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360"} Dec 09 03:49:09 crc kubenswrapper[4766]: I1209 03:49:09.942788 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k8lmw" podStartSLOduration=2.5336931209999998 podStartE2EDuration="4.942770106s" podCreationTimestamp="2025-12-09 03:49:05 +0000 UTC" firstStartedPulling="2025-12-09 03:49:06.856057199 +0000 UTC m=+2228.565362625" lastFinishedPulling="2025-12-09 03:49:09.265134174 +0000 UTC m=+2230.974439610" observedRunningTime="2025-12-09 03:49:09.938772378 +0000 UTC m=+2231.648077804" watchObservedRunningTime="2025-12-09 03:49:09.942770106 +0000 UTC m=+2231.652075532" Dec 09 03:49:10 crc kubenswrapper[4766]: I1209 03:49:10.913300 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-264gd" event={"ID":"ca21492d-3963-436f-9b85-91169c8e91be","Type":"ContainerStarted","Data":"b6d42b3cb4b1433fe5964cfea2b7a8904f11610ace1812c76cd458070986b3f6"} Dec 09 03:49:10 crc kubenswrapper[4766]: I1209 03:49:10.939947 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-264gd" podStartSLOduration=2.515653522 podStartE2EDuration="4.939927648s" podCreationTimestamp="2025-12-09 03:49:06 +0000 UTC" firstStartedPulling="2025-12-09 03:49:07.879818498 +0000 UTC m=+2229.589123964" lastFinishedPulling="2025-12-09 03:49:10.304092664 +0000 UTC m=+2232.013398090" observedRunningTime="2025-12-09 03:49:10.932349224 +0000 UTC m=+2232.641654680" 
watchObservedRunningTime="2025-12-09 03:49:10.939927648 +0000 UTC m=+2232.649233084" Dec 09 03:49:15 crc kubenswrapper[4766]: I1209 03:49:15.675316 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:15 crc kubenswrapper[4766]: I1209 03:49:15.675810 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:15 crc kubenswrapper[4766]: I1209 03:49:15.740002 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:15 crc kubenswrapper[4766]: I1209 03:49:15.839370 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:49:15 crc kubenswrapper[4766]: E1209 03:49:15.839788 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:49:16 crc kubenswrapper[4766]: I1209 03:49:16.010736 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:16 crc kubenswrapper[4766]: I1209 03:49:16.646535 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:16 crc kubenswrapper[4766]: I1209 03:49:16.646644 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:16 crc kubenswrapper[4766]: I1209 03:49:16.710955 4766 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:17 crc kubenswrapper[4766]: I1209 03:49:17.043580 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:17 crc kubenswrapper[4766]: I1209 03:49:17.288200 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8lmw"] Dec 09 03:49:17 crc kubenswrapper[4766]: I1209 03:49:17.991731 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k8lmw" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerName="registry-server" containerID="cri-o://53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360" gracePeriod=2 Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.705289 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.792090 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r7t7\" (UniqueName: \"kubernetes.io/projected/1755e6ca-cd09-4b44-9aa6-bf29789c9377-kube-api-access-9r7t7\") pod \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.792194 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-utilities\") pod \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.792264 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-catalog-content\") pod \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\" (UID: \"1755e6ca-cd09-4b44-9aa6-bf29789c9377\") " Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.793608 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-utilities" (OuterVolumeSpecName: "utilities") pod "1755e6ca-cd09-4b44-9aa6-bf29789c9377" (UID: "1755e6ca-cd09-4b44-9aa6-bf29789c9377"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.811574 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1755e6ca-cd09-4b44-9aa6-bf29789c9377-kube-api-access-9r7t7" (OuterVolumeSpecName: "kube-api-access-9r7t7") pod "1755e6ca-cd09-4b44-9aa6-bf29789c9377" (UID: "1755e6ca-cd09-4b44-9aa6-bf29789c9377"). InnerVolumeSpecName "kube-api-access-9r7t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.851246 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1755e6ca-cd09-4b44-9aa6-bf29789c9377" (UID: "1755e6ca-cd09-4b44-9aa6-bf29789c9377"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.893969 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r7t7\" (UniqueName: \"kubernetes.io/projected/1755e6ca-cd09-4b44-9aa6-bf29789c9377-kube-api-access-9r7t7\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.893999 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:18 crc kubenswrapper[4766]: I1209 03:49:18.894010 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1755e6ca-cd09-4b44-9aa6-bf29789c9377-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.002517 4766 generic.go:334] "Generic (PLEG): container finished" podID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerID="53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360" exitCode=0 Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.002594 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8lmw" event={"ID":"1755e6ca-cd09-4b44-9aa6-bf29789c9377","Type":"ContainerDied","Data":"53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360"} Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.002634 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8lmw" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.003515 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8lmw" event={"ID":"1755e6ca-cd09-4b44-9aa6-bf29789c9377","Type":"ContainerDied","Data":"1e467c2859c3790aef2356c80d9f94ceee2dacbfa9e26f7625bae99867c57c7e"} Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.003573 4766 scope.go:117] "RemoveContainer" containerID="53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.028509 4766 scope.go:117] "RemoveContainer" containerID="356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.053241 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8lmw"] Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.057142 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k8lmw"] Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.077264 4766 scope.go:117] "RemoveContainer" containerID="331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.094115 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-264gd"] Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.094470 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-264gd" podUID="ca21492d-3963-436f-9b85-91169c8e91be" containerName="registry-server" containerID="cri-o://b6d42b3cb4b1433fe5964cfea2b7a8904f11610ace1812c76cd458070986b3f6" gracePeriod=2 Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.108641 4766 scope.go:117] "RemoveContainer" 
containerID="53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360" Dec 09 03:49:19 crc kubenswrapper[4766]: E1209 03:49:19.109229 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360\": container with ID starting with 53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360 not found: ID does not exist" containerID="53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.109262 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360"} err="failed to get container status \"53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360\": rpc error: code = NotFound desc = could not find container \"53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360\": container with ID starting with 53e4209ec1f23f6130f5dd50788266ac7c9269ce5fa8386aee13861b07255360 not found: ID does not exist" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.109310 4766 scope.go:117] "RemoveContainer" containerID="356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b" Dec 09 03:49:19 crc kubenswrapper[4766]: E1209 03:49:19.109999 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b\": container with ID starting with 356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b not found: ID does not exist" containerID="356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.110048 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b"} err="failed to get container status \"356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b\": rpc error: code = NotFound desc = could not find container \"356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b\": container with ID starting with 356f8acb520e590ac2808a174b3a391bd6b7dc8d7ef0821d2589eef42c69f18b not found: ID does not exist" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.110064 4766 scope.go:117] "RemoveContainer" containerID="331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603" Dec 09 03:49:19 crc kubenswrapper[4766]: E1209 03:49:19.110386 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603\": container with ID starting with 331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603 not found: ID does not exist" containerID="331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603" Dec 09 03:49:19 crc kubenswrapper[4766]: I1209 03:49:19.110402 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603"} err="failed to get container status \"331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603\": rpc error: code = NotFound desc = could not find container \"331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603\": container with ID starting with 331b390b333ab6cb89ce339316a5dd2704e90666c83a8a7d7448a12ec4454603 not found: ID does not exist" Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.020269 4766 generic.go:334] "Generic (PLEG): container finished" podID="ca21492d-3963-436f-9b85-91169c8e91be" containerID="b6d42b3cb4b1433fe5964cfea2b7a8904f11610ace1812c76cd458070986b3f6" exitCode=0 Dec 09 03:49:20 crc kubenswrapper[4766]: 
I1209 03:49:20.020372 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-264gd" event={"ID":"ca21492d-3963-436f-9b85-91169c8e91be","Type":"ContainerDied","Data":"b6d42b3cb4b1433fe5964cfea2b7a8904f11610ace1812c76cd458070986b3f6"} Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.644868 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.719200 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-catalog-content\") pod \"ca21492d-3963-436f-9b85-91169c8e91be\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.719279 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-utilities\") pod \"ca21492d-3963-436f-9b85-91169c8e91be\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.719311 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkgz8\" (UniqueName: \"kubernetes.io/projected/ca21492d-3963-436f-9b85-91169c8e91be-kube-api-access-lkgz8\") pod \"ca21492d-3963-436f-9b85-91169c8e91be\" (UID: \"ca21492d-3963-436f-9b85-91169c8e91be\") " Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.720276 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-utilities" (OuterVolumeSpecName: "utilities") pod "ca21492d-3963-436f-9b85-91169c8e91be" (UID: "ca21492d-3963-436f-9b85-91169c8e91be"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.720572 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.730638 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca21492d-3963-436f-9b85-91169c8e91be-kube-api-access-lkgz8" (OuterVolumeSpecName: "kube-api-access-lkgz8") pod "ca21492d-3963-436f-9b85-91169c8e91be" (UID: "ca21492d-3963-436f-9b85-91169c8e91be"). InnerVolumeSpecName "kube-api-access-lkgz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.775024 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca21492d-3963-436f-9b85-91169c8e91be" (UID: "ca21492d-3963-436f-9b85-91169c8e91be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.821906 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkgz8\" (UniqueName: \"kubernetes.io/projected/ca21492d-3963-436f-9b85-91169c8e91be-kube-api-access-lkgz8\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.822242 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca21492d-3963-436f-9b85-91169c8e91be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:49:20 crc kubenswrapper[4766]: I1209 03:49:20.853577 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" path="/var/lib/kubelet/pods/1755e6ca-cd09-4b44-9aa6-bf29789c9377/volumes" Dec 09 03:49:21 crc kubenswrapper[4766]: I1209 03:49:21.029569 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-264gd" event={"ID":"ca21492d-3963-436f-9b85-91169c8e91be","Type":"ContainerDied","Data":"1960090d08d5b6846c23c324a1c7769789882ef28bafb8898632a88da2ed2e44"} Dec 09 03:49:21 crc kubenswrapper[4766]: I1209 03:49:21.030548 4766 scope.go:117] "RemoveContainer" containerID="b6d42b3cb4b1433fe5964cfea2b7a8904f11610ace1812c76cd458070986b3f6" Dec 09 03:49:21 crc kubenswrapper[4766]: I1209 03:49:21.030047 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-264gd" Dec 09 03:49:21 crc kubenswrapper[4766]: I1209 03:49:21.055110 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-264gd"] Dec 09 03:49:21 crc kubenswrapper[4766]: I1209 03:49:21.057115 4766 scope.go:117] "RemoveContainer" containerID="c1a115117a838714d53cefb39ce6426b89bc60e9a6b61a2189661d7216dc2fbf" Dec 09 03:49:21 crc kubenswrapper[4766]: I1209 03:49:21.062345 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-264gd"] Dec 09 03:49:21 crc kubenswrapper[4766]: I1209 03:49:21.072571 4766 scope.go:117] "RemoveContainer" containerID="4a67f029219963e91514d9b5fecb356b00269a7bb708b9eb73541229075c2dd5" Dec 09 03:49:22 crc kubenswrapper[4766]: I1209 03:49:22.855794 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca21492d-3963-436f-9b85-91169c8e91be" path="/var/lib/kubelet/pods/ca21492d-3963-436f-9b85-91169c8e91be/volumes" Dec 09 03:49:27 crc kubenswrapper[4766]: I1209 03:49:27.839420 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:49:27 crc kubenswrapper[4766]: E1209 03:49:27.840810 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:49:40 crc kubenswrapper[4766]: I1209 03:49:40.839316 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:49:40 crc kubenswrapper[4766]: E1209 03:49:40.840106 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:49:51 crc kubenswrapper[4766]: I1209 03:49:51.839482 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:49:51 crc kubenswrapper[4766]: E1209 03:49:51.840435 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:50:03 crc kubenswrapper[4766]: I1209 03:50:03.840303 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:50:03 crc kubenswrapper[4766]: E1209 03:50:03.841455 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:50:14 crc kubenswrapper[4766]: I1209 03:50:14.840178 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:50:14 crc kubenswrapper[4766]: E1209 03:50:14.842069 4766 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:50:25 crc kubenswrapper[4766]: I1209 03:50:25.839684 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:50:25 crc kubenswrapper[4766]: E1209 03:50:25.840509 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:50:36 crc kubenswrapper[4766]: I1209 03:50:36.839759 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:50:36 crc kubenswrapper[4766]: E1209 03:50:36.840657 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:50:50 crc kubenswrapper[4766]: I1209 03:50:50.839736 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:50:50 crc kubenswrapper[4766]: E1209 03:50:50.840535 4766 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:51:04 crc kubenswrapper[4766]: I1209 03:51:04.840074 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:51:04 crc kubenswrapper[4766]: E1209 03:51:04.840850 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:51:16 crc kubenswrapper[4766]: I1209 03:51:16.839572 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:51:16 crc kubenswrapper[4766]: E1209 03:51:16.840992 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:51:31 crc kubenswrapper[4766]: I1209 03:51:31.839136 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:51:31 crc kubenswrapper[4766]: E1209 
03:51:31.840196 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:51:42 crc kubenswrapper[4766]: I1209 03:51:42.839954 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:51:42 crc kubenswrapper[4766]: E1209 03:51:42.840672 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:51:56 crc kubenswrapper[4766]: I1209 03:51:56.839142 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:51:56 crc kubenswrapper[4766]: E1209 03:51:56.840332 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:52:11 crc kubenswrapper[4766]: I1209 03:52:11.839860 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:52:11 crc 
kubenswrapper[4766]: E1209 03:52:11.841014 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:52:23 crc kubenswrapper[4766]: I1209 03:52:23.840294 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:52:23 crc kubenswrapper[4766]: E1209 03:52:23.841722 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:52:38 crc kubenswrapper[4766]: I1209 03:52:38.845058 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:52:38 crc kubenswrapper[4766]: E1209 03:52:38.845857 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:52:49 crc kubenswrapper[4766]: I1209 03:52:49.840571 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 
09 03:52:49 crc kubenswrapper[4766]: E1209 03:52:49.841407 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:53:02 crc kubenswrapper[4766]: I1209 03:53:02.845016 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:53:02 crc kubenswrapper[4766]: E1209 03:53:02.845727 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:53:17 crc kubenswrapper[4766]: I1209 03:53:17.839789 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:53:17 crc kubenswrapper[4766]: E1209 03:53:17.840803 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:53:28 crc kubenswrapper[4766]: I1209 03:53:28.844976 4766 scope.go:117] "RemoveContainer" 
containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:53:28 crc kubenswrapper[4766]: E1209 03:53:28.845901 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 03:53:40 crc kubenswrapper[4766]: I1209 03:53:40.839572 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:53:41 crc kubenswrapper[4766]: I1209 03:53:41.498313 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"08aaab589a25b1eb0b46b67b3f7484b5d02189326221ce164d8ab2c12ae184eb"} Dec 09 03:56:07 crc kubenswrapper[4766]: I1209 03:56:07.316594 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:56:07 crc kubenswrapper[4766]: I1209 03:56:07.317161 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:56:37 crc kubenswrapper[4766]: I1209 03:56:37.316863 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:56:37 crc kubenswrapper[4766]: I1209 03:56:37.317658 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:57:07 crc kubenswrapper[4766]: I1209 03:57:07.316824 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:57:07 crc kubenswrapper[4766]: I1209 03:57:07.317457 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:57:07 crc kubenswrapper[4766]: I1209 03:57:07.317512 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 03:57:07 crc kubenswrapper[4766]: I1209 03:57:07.318257 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08aaab589a25b1eb0b46b67b3f7484b5d02189326221ce164d8ab2c12ae184eb"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 03:57:07 crc 
kubenswrapper[4766]: I1209 03:57:07.318328 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://08aaab589a25b1eb0b46b67b3f7484b5d02189326221ce164d8ab2c12ae184eb" gracePeriod=600 Dec 09 03:57:08 crc kubenswrapper[4766]: I1209 03:57:08.265519 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="08aaab589a25b1eb0b46b67b3f7484b5d02189326221ce164d8ab2c12ae184eb" exitCode=0 Dec 09 03:57:08 crc kubenswrapper[4766]: I1209 03:57:08.265580 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"08aaab589a25b1eb0b46b67b3f7484b5d02189326221ce164d8ab2c12ae184eb"} Dec 09 03:57:08 crc kubenswrapper[4766]: I1209 03:57:08.266397 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426"} Dec 09 03:57:08 crc kubenswrapper[4766]: I1209 03:57:08.266485 4766 scope.go:117] "RemoveContainer" containerID="56c434fb9d0cd9113ce6e438849eb6b236b26b73bbaf164154d89b4763635825" Dec 09 03:59:07 crc kubenswrapper[4766]: I1209 03:59:07.316196 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:59:07 crc kubenswrapper[4766]: I1209 03:59:07.317000 4766 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:59:37 crc kubenswrapper[4766]: I1209 03:59:37.317111 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 03:59:37 crc kubenswrapper[4766]: I1209 03:59:37.317682 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.233085 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6xhj"] Dec 09 03:59:46 crc kubenswrapper[4766]: E1209 03:59:46.234286 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca21492d-3963-436f-9b85-91169c8e91be" containerName="registry-server" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.234312 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca21492d-3963-436f-9b85-91169c8e91be" containerName="registry-server" Dec 09 03:59:46 crc kubenswrapper[4766]: E1209 03:59:46.234376 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerName="registry-server" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.234389 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerName="registry-server" Dec 09 03:59:46 
crc kubenswrapper[4766]: E1209 03:59:46.234410 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca21492d-3963-436f-9b85-91169c8e91be" containerName="extract-content" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.234422 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca21492d-3963-436f-9b85-91169c8e91be" containerName="extract-content" Dec 09 03:59:46 crc kubenswrapper[4766]: E1209 03:59:46.234440 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerName="extract-utilities" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.234453 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerName="extract-utilities" Dec 09 03:59:46 crc kubenswrapper[4766]: E1209 03:59:46.234478 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerName="extract-content" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.234490 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerName="extract-content" Dec 09 03:59:46 crc kubenswrapper[4766]: E1209 03:59:46.234512 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca21492d-3963-436f-9b85-91169c8e91be" containerName="extract-utilities" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.234524 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca21492d-3963-436f-9b85-91169c8e91be" containerName="extract-utilities" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.234839 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca21492d-3963-436f-9b85-91169c8e91be" containerName="registry-server" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.234898 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1755e6ca-cd09-4b44-9aa6-bf29789c9377" containerName="registry-server" Dec 09 
03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.236906 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.248335 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6xhj"] Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.436015 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-utilities\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.436104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlth8\" (UniqueName: \"kubernetes.io/projected/9b0279f6-bc45-40e8-a851-27ac3e617222-kube-api-access-rlth8\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.436125 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-catalog-content\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.538114 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-utilities\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 
crc kubenswrapper[4766]: I1209 03:59:46.538248 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlth8\" (UniqueName: \"kubernetes.io/projected/9b0279f6-bc45-40e8-a851-27ac3e617222-kube-api-access-rlth8\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.538277 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-catalog-content\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.538923 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-catalog-content\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.538932 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-utilities\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 03:59:46.560861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlth8\" (UniqueName: \"kubernetes.io/projected/9b0279f6-bc45-40e8-a851-27ac3e617222-kube-api-access-rlth8\") pod \"redhat-marketplace-h6xhj\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:46 crc kubenswrapper[4766]: I1209 
03:59:46.607529 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:47 crc kubenswrapper[4766]: I1209 03:59:47.027954 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6xhj"] Dec 09 03:59:47 crc kubenswrapper[4766]: W1209 03:59:47.037851 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0279f6_bc45_40e8_a851_27ac3e617222.slice/crio-1e95bb5c4ad02b7877232969e31c3e92ee4d02aaed192270468cf0742269b33e WatchSource:0}: Error finding container 1e95bb5c4ad02b7877232969e31c3e92ee4d02aaed192270468cf0742269b33e: Status 404 returned error can't find the container with id 1e95bb5c4ad02b7877232969e31c3e92ee4d02aaed192270468cf0742269b33e Dec 09 03:59:47 crc kubenswrapper[4766]: I1209 03:59:47.822386 4766 generic.go:334] "Generic (PLEG): container finished" podID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerID="36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12" exitCode=0 Dec 09 03:59:47 crc kubenswrapper[4766]: I1209 03:59:47.822796 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6xhj" event={"ID":"9b0279f6-bc45-40e8-a851-27ac3e617222","Type":"ContainerDied","Data":"36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12"} Dec 09 03:59:47 crc kubenswrapper[4766]: I1209 03:59:47.822836 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6xhj" event={"ID":"9b0279f6-bc45-40e8-a851-27ac3e617222","Type":"ContainerStarted","Data":"1e95bb5c4ad02b7877232969e31c3e92ee4d02aaed192270468cf0742269b33e"} Dec 09 03:59:47 crc kubenswrapper[4766]: I1209 03:59:47.825126 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 03:59:48 crc kubenswrapper[4766]: I1209 03:59:48.833087 4766 
generic.go:334] "Generic (PLEG): container finished" podID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerID="cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83" exitCode=0 Dec 09 03:59:48 crc kubenswrapper[4766]: I1209 03:59:48.833167 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6xhj" event={"ID":"9b0279f6-bc45-40e8-a851-27ac3e617222","Type":"ContainerDied","Data":"cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83"} Dec 09 03:59:49 crc kubenswrapper[4766]: I1209 03:59:49.845993 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6xhj" event={"ID":"9b0279f6-bc45-40e8-a851-27ac3e617222","Type":"ContainerStarted","Data":"6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7"} Dec 09 03:59:49 crc kubenswrapper[4766]: I1209 03:59:49.897352 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6xhj" podStartSLOduration=2.493310399 podStartE2EDuration="3.897332928s" podCreationTimestamp="2025-12-09 03:59:46 +0000 UTC" firstStartedPulling="2025-12-09 03:59:47.824890523 +0000 UTC m=+2869.534195959" lastFinishedPulling="2025-12-09 03:59:49.228913032 +0000 UTC m=+2870.938218488" observedRunningTime="2025-12-09 03:59:49.871787403 +0000 UTC m=+2871.581092869" watchObservedRunningTime="2025-12-09 03:59:49.897332928 +0000 UTC m=+2871.606638354" Dec 09 03:59:56 crc kubenswrapper[4766]: I1209 03:59:56.607800 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:56 crc kubenswrapper[4766]: I1209 03:59:56.608435 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:56 crc kubenswrapper[4766]: I1209 03:59:56.656719 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:56 crc kubenswrapper[4766]: I1209 03:59:56.939924 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:56 crc kubenswrapper[4766]: I1209 03:59:56.980197 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6xhj"] Dec 09 03:59:58 crc kubenswrapper[4766]: I1209 03:59:58.915720 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6xhj" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerName="registry-server" containerID="cri-o://6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7" gracePeriod=2 Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.317643 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5pjsr"] Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.319685 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.330370 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5pjsr"] Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.343337 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl7kp\" (UniqueName: \"kubernetes.io/projected/36e8a294-0810-4bf5-99eb-72ef3bf74932-kube-api-access-pl7kp\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.343381 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-utilities\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.343444 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-catalog-content\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.445254 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-utilities\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.445371 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-catalog-content\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.445422 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl7kp\" (UniqueName: \"kubernetes.io/projected/36e8a294-0810-4bf5-99eb-72ef3bf74932-kube-api-access-pl7kp\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.446167 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-utilities\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.446309 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-catalog-content\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.468743 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl7kp\" (UniqueName: \"kubernetes.io/projected/36e8a294-0810-4bf5-99eb-72ef3bf74932-kube-api-access-pl7kp\") pod \"redhat-operators-5pjsr\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.643516 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.844703 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.852020 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-catalog-content\") pod \"9b0279f6-bc45-40e8-a851-27ac3e617222\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.852144 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-utilities\") pod \"9b0279f6-bc45-40e8-a851-27ac3e617222\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.852202 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlth8\" (UniqueName: \"kubernetes.io/projected/9b0279f6-bc45-40e8-a851-27ac3e617222-kube-api-access-rlth8\") pod \"9b0279f6-bc45-40e8-a851-27ac3e617222\" (UID: \"9b0279f6-bc45-40e8-a851-27ac3e617222\") " Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.854002 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-utilities" (OuterVolumeSpecName: "utilities") pod "9b0279f6-bc45-40e8-a851-27ac3e617222" (UID: "9b0279f6-bc45-40e8-a851-27ac3e617222"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.860734 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0279f6-bc45-40e8-a851-27ac3e617222-kube-api-access-rlth8" (OuterVolumeSpecName: "kube-api-access-rlth8") pod "9b0279f6-bc45-40e8-a851-27ac3e617222" (UID: "9b0279f6-bc45-40e8-a851-27ac3e617222"). InnerVolumeSpecName "kube-api-access-rlth8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.877339 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b0279f6-bc45-40e8-a851-27ac3e617222" (UID: "9b0279f6-bc45-40e8-a851-27ac3e617222"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.924544 4766 generic.go:334] "Generic (PLEG): container finished" podID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerID="6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7" exitCode=0 Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.924590 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6xhj" event={"ID":"9b0279f6-bc45-40e8-a851-27ac3e617222","Type":"ContainerDied","Data":"6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7"} Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.924619 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6xhj" event={"ID":"9b0279f6-bc45-40e8-a851-27ac3e617222","Type":"ContainerDied","Data":"1e95bb5c4ad02b7877232969e31c3e92ee4d02aaed192270468cf0742269b33e"} Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.924636 4766 scope.go:117] "RemoveContainer" 
containerID="6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.924765 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6xhj" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.954292 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6xhj"] Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.954934 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.954953 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b0279f6-bc45-40e8-a851-27ac3e617222-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.954964 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlth8\" (UniqueName: \"kubernetes.io/projected/9b0279f6-bc45-40e8-a851-27ac3e617222-kube-api-access-rlth8\") on node \"crc\" DevicePath \"\"" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.958790 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6xhj"] Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.966164 4766 scope.go:117] "RemoveContainer" containerID="cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83" Dec 09 03:59:59 crc kubenswrapper[4766]: I1209 03:59:59.985867 4766 scope.go:117] "RemoveContainer" containerID="36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.002223 4766 scope.go:117] "RemoveContainer" containerID="6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7" Dec 09 04:00:00 crc 
kubenswrapper[4766]: E1209 04:00:00.002741 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7\": container with ID starting with 6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7 not found: ID does not exist" containerID="6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.002774 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7"} err="failed to get container status \"6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7\": rpc error: code = NotFound desc = could not find container \"6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7\": container with ID starting with 6d73efb86f72666e9bb15de6e3eee898e46546914dc8a85605315d3c8fea29b7 not found: ID does not exist" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.002799 4766 scope.go:117] "RemoveContainer" containerID="cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83" Dec 09 04:00:00 crc kubenswrapper[4766]: E1209 04:00:00.003194 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83\": container with ID starting with cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83 not found: ID does not exist" containerID="cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.003262 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83"} err="failed to get container status 
\"cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83\": rpc error: code = NotFound desc = could not find container \"cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83\": container with ID starting with cca4769f7cfa9b6053100f9a738e6efe21c10cf2f755ae96ce0bd2d175471b83 not found: ID does not exist" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.003276 4766 scope.go:117] "RemoveContainer" containerID="36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12" Dec 09 04:00:00 crc kubenswrapper[4766]: E1209 04:00:00.003602 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12\": container with ID starting with 36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12 not found: ID does not exist" containerID="36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.003647 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12"} err="failed to get container status \"36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12\": rpc error: code = NotFound desc = could not find container \"36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12\": container with ID starting with 36ccd13c7322b6642efd7c2a1d92674747494c14b3105753f34fc252e8278d12 not found: ID does not exist" Dec 09 04:00:00 crc kubenswrapper[4766]: W1209 04:00:00.152699 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e8a294_0810_4bf5_99eb_72ef3bf74932.slice/crio-47c01f4d010809a376afc38a08d3fbe93a4f4f21aba70814648b5a674e91957e WatchSource:0}: Error finding container 47c01f4d010809a376afc38a08d3fbe93a4f4f21aba70814648b5a674e91957e: Status 404 returned 
error can't find the container with id 47c01f4d010809a376afc38a08d3fbe93a4f4f21aba70814648b5a674e91957e Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.160054 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5pjsr"] Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.168408 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw"] Dec 09 04:00:00 crc kubenswrapper[4766]: E1209 04:00:00.168894 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerName="registry-server" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.168918 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerName="registry-server" Dec 09 04:00:00 crc kubenswrapper[4766]: E1209 04:00:00.168934 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerName="extract-content" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.168943 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerName="extract-content" Dec 09 04:00:00 crc kubenswrapper[4766]: E1209 04:00:00.168959 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerName="extract-utilities" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.168968 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerName="extract-utilities" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.169135 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" containerName="registry-server" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.169775 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.173591 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.173712 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.182091 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw"] Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.257609 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-secret-volume\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.257695 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-config-volume\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.257724 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhh7v\" (UniqueName: \"kubernetes.io/projected/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-kube-api-access-lhh7v\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.358719 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-config-volume\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.359016 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhh7v\" (UniqueName: \"kubernetes.io/projected/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-kube-api-access-lhh7v\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.359146 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-secret-volume\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.359629 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-config-volume\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.365549 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-secret-volume\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.381888 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhh7v\" (UniqueName: \"kubernetes.io/projected/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-kube-api-access-lhh7v\") pod \"collect-profiles-29420880-pk6nw\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.496600 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.849654 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0279f6-bc45-40e8-a851-27ac3e617222" path="/var/lib/kubelet/pods/9b0279f6-bc45-40e8-a851-27ac3e617222/volumes" Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.917703 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw"] Dec 09 04:00:00 crc kubenswrapper[4766]: W1209 04:00:00.920023 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode474f6e1_79e8_41e7_a86f_c3eaf99d0cf9.slice/crio-4684f47bdebb565c84b88227a47d86248305ae7aa74b2d3cefa57b13eb9da424 WatchSource:0}: Error finding container 4684f47bdebb565c84b88227a47d86248305ae7aa74b2d3cefa57b13eb9da424: Status 404 returned error can't find the container with id 4684f47bdebb565c84b88227a47d86248305ae7aa74b2d3cefa57b13eb9da424 Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.932012 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" event={"ID":"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9","Type":"ContainerStarted","Data":"4684f47bdebb565c84b88227a47d86248305ae7aa74b2d3cefa57b13eb9da424"} Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.934974 4766 generic.go:334] "Generic (PLEG): container finished" podID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerID="db2ba3b370f2925bd575d8e7f764f7d3d9a3beaab1091b0e9bb7b8cec6024ae7" exitCode=0 Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.935009 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5pjsr" event={"ID":"36e8a294-0810-4bf5-99eb-72ef3bf74932","Type":"ContainerDied","Data":"db2ba3b370f2925bd575d8e7f764f7d3d9a3beaab1091b0e9bb7b8cec6024ae7"} Dec 09 04:00:00 crc kubenswrapper[4766]: I1209 04:00:00.935024 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5pjsr" event={"ID":"36e8a294-0810-4bf5-99eb-72ef3bf74932","Type":"ContainerStarted","Data":"47c01f4d010809a376afc38a08d3fbe93a4f4f21aba70814648b5a674e91957e"} Dec 09 04:00:01 crc kubenswrapper[4766]: I1209 04:00:01.943416 4766 generic.go:334] "Generic (PLEG): container finished" podID="e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9" containerID="8c006bbf0c9b96874dda4aed6fae029bc9865cd0d6ac71130032fa7525335a3b" exitCode=0 Dec 09 04:00:01 crc kubenswrapper[4766]: I1209 04:00:01.943490 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" event={"ID":"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9","Type":"ContainerDied","Data":"8c006bbf0c9b96874dda4aed6fae029bc9865cd0d6ac71130032fa7525335a3b"} Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.355195 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.441937 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhh7v\" (UniqueName: \"kubernetes.io/projected/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-kube-api-access-lhh7v\") pod \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.442240 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-config-volume\") pod \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.442395 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-secret-volume\") pod \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\" (UID: \"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9\") " Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.442935 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-config-volume" (OuterVolumeSpecName: "config-volume") pod "e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9" (UID: "e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.447244 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9" (UID: "e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.447449 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-kube-api-access-lhh7v" (OuterVolumeSpecName: "kube-api-access-lhh7v") pod "e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9" (UID: "e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9"). InnerVolumeSpecName "kube-api-access-lhh7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.543956 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.543995 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhh7v\" (UniqueName: \"kubernetes.io/projected/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-kube-api-access-lhh7v\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.544006 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.961985 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" event={"ID":"e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9","Type":"ContainerDied","Data":"4684f47bdebb565c84b88227a47d86248305ae7aa74b2d3cefa57b13eb9da424"} Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.962385 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4684f47bdebb565c84b88227a47d86248305ae7aa74b2d3cefa57b13eb9da424" Dec 09 04:00:03 crc kubenswrapper[4766]: I1209 04:00:03.962204 4766 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw" Dec 09 04:00:04 crc kubenswrapper[4766]: E1209 04:00:04.148692 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode474f6e1_79e8_41e7_a86f_c3eaf99d0cf9.slice\": RecentStats: unable to find data in memory cache]" Dec 09 04:00:04 crc kubenswrapper[4766]: I1209 04:00:04.440299 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm"] Dec 09 04:00:04 crc kubenswrapper[4766]: I1209 04:00:04.447291 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420835-mn4qm"] Dec 09 04:00:04 crc kubenswrapper[4766]: I1209 04:00:04.849761 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c71c43-36b4-4147-b4c6-5e524c156343" path="/var/lib/kubelet/pods/35c71c43-36b4-4147-b4c6-5e524c156343/volumes" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.506133 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2zzh7"] Dec 09 04:00:05 crc kubenswrapper[4766]: E1209 04:00:05.506656 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9" containerName="collect-profiles" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.506672 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9" containerName="collect-profiles" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.506832 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9" containerName="collect-profiles" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.507928 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.519391 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2zzh7"] Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.570766 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsz2h\" (UniqueName: \"kubernetes.io/projected/dd35cb98-0d90-493c-b148-89bf8c6222f6-kube-api-access-zsz2h\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.570909 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-utilities\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.571118 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-catalog-content\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.672567 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-catalog-content\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.672655 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zsz2h\" (UniqueName: \"kubernetes.io/projected/dd35cb98-0d90-493c-b148-89bf8c6222f6-kube-api-access-zsz2h\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.672694 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-utilities\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.673125 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-catalog-content\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.673142 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-utilities\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.692139 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsz2h\" (UniqueName: \"kubernetes.io/projected/dd35cb98-0d90-493c-b148-89bf8c6222f6-kube-api-access-zsz2h\") pod \"certified-operators-2zzh7\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:05 crc kubenswrapper[4766]: I1209 04:00:05.826634 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:06 crc kubenswrapper[4766]: I1209 04:00:06.081341 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2zzh7"] Dec 09 04:00:06 crc kubenswrapper[4766]: I1209 04:00:06.986838 4766 generic.go:334] "Generic (PLEG): container finished" podID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerID="18a38ccb90df809f592d6b1e1dc586b97c072c9e688318932ce0de31dc48bc85" exitCode=0 Dec 09 04:00:06 crc kubenswrapper[4766]: I1209 04:00:06.986891 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zzh7" event={"ID":"dd35cb98-0d90-493c-b148-89bf8c6222f6","Type":"ContainerDied","Data":"18a38ccb90df809f592d6b1e1dc586b97c072c9e688318932ce0de31dc48bc85"} Dec 09 04:00:06 crc kubenswrapper[4766]: I1209 04:00:06.988884 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zzh7" event={"ID":"dd35cb98-0d90-493c-b148-89bf8c6222f6","Type":"ContainerStarted","Data":"570bba67080788c96b314ad7d03cd4320caccd665c52bd6166eaccef4d95d5d1"} Dec 09 04:00:07 crc kubenswrapper[4766]: I1209 04:00:07.316776 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:00:07 crc kubenswrapper[4766]: I1209 04:00:07.316848 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:00:07 crc kubenswrapper[4766]: I1209 04:00:07.316907 4766 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:00:07 crc kubenswrapper[4766]: I1209 04:00:07.317781 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:00:07 crc kubenswrapper[4766]: I1209 04:00:07.317876 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" gracePeriod=600 Dec 09 04:00:07 crc kubenswrapper[4766]: E1209 04:00:07.452700 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:00:07 crc kubenswrapper[4766]: I1209 04:00:07.998649 4766 generic.go:334] "Generic (PLEG): container finished" podID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerID="fd5ef444e6b278cc48a6f555cacef9024996ffefaf47d3dc55a1aa9554b5bb6f" exitCode=0 Dec 09 04:00:08 crc kubenswrapper[4766]: I1209 04:00:07.998728 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zzh7" 
event={"ID":"dd35cb98-0d90-493c-b148-89bf8c6222f6","Type":"ContainerDied","Data":"fd5ef444e6b278cc48a6f555cacef9024996ffefaf47d3dc55a1aa9554b5bb6f"} Dec 09 04:00:08 crc kubenswrapper[4766]: I1209 04:00:08.001693 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" exitCode=0 Dec 09 04:00:08 crc kubenswrapper[4766]: I1209 04:00:08.001746 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426"} Dec 09 04:00:08 crc kubenswrapper[4766]: I1209 04:00:08.001813 4766 scope.go:117] "RemoveContainer" containerID="08aaab589a25b1eb0b46b67b3f7484b5d02189326221ce164d8ab2c12ae184eb" Dec 09 04:00:08 crc kubenswrapper[4766]: I1209 04:00:08.002265 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:00:08 crc kubenswrapper[4766]: E1209 04:00:08.002632 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:00:09 crc kubenswrapper[4766]: I1209 04:00:09.012385 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zzh7" event={"ID":"dd35cb98-0d90-493c-b148-89bf8c6222f6","Type":"ContainerStarted","Data":"afdc0737787e6b018b232dd2802b915fde2bc9a5bc8b3a715653235a8c8909f0"} Dec 09 04:00:09 crc kubenswrapper[4766]: I1209 04:00:09.029933 4766 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2zzh7" podStartSLOduration=2.604156203 podStartE2EDuration="4.029913353s" podCreationTimestamp="2025-12-09 04:00:05 +0000 UTC" firstStartedPulling="2025-12-09 04:00:06.988707287 +0000 UTC m=+2888.698012713" lastFinishedPulling="2025-12-09 04:00:08.414464437 +0000 UTC m=+2890.123769863" observedRunningTime="2025-12-09 04:00:09.028758611 +0000 UTC m=+2890.738064047" watchObservedRunningTime="2025-12-09 04:00:09.029913353 +0000 UTC m=+2890.739218779" Dec 09 04:00:12 crc kubenswrapper[4766]: I1209 04:00:12.046652 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5pjsr" event={"ID":"36e8a294-0810-4bf5-99eb-72ef3bf74932","Type":"ContainerStarted","Data":"e5449a9ee57c661ac20cda8cf6c9c026ac68dedc8ae3305087be1f24fa60d7db"} Dec 09 04:00:12 crc kubenswrapper[4766]: I1209 04:00:12.504374 4766 scope.go:117] "RemoveContainer" containerID="c3498cd8dcee4146efe99589ef7392db4990be9443acc8daa3a5909e8d9de0d4" Dec 09 04:00:13 crc kubenswrapper[4766]: I1209 04:00:13.057771 4766 generic.go:334] "Generic (PLEG): container finished" podID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerID="e5449a9ee57c661ac20cda8cf6c9c026ac68dedc8ae3305087be1f24fa60d7db" exitCode=0 Dec 09 04:00:13 crc kubenswrapper[4766]: I1209 04:00:13.057818 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5pjsr" event={"ID":"36e8a294-0810-4bf5-99eb-72ef3bf74932","Type":"ContainerDied","Data":"e5449a9ee57c661ac20cda8cf6c9c026ac68dedc8ae3305087be1f24fa60d7db"} Dec 09 04:00:14 crc kubenswrapper[4766]: I1209 04:00:14.067616 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5pjsr" event={"ID":"36e8a294-0810-4bf5-99eb-72ef3bf74932","Type":"ContainerStarted","Data":"745f4faeca3aedcbc27272615efc0bc021c3dc6d2bece5c9b4cd970211268f1b"} Dec 09 04:00:14 crc 
kubenswrapper[4766]: I1209 04:00:14.089355 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5pjsr" podStartSLOduration=2.254026945 podStartE2EDuration="15.089338592s" podCreationTimestamp="2025-12-09 03:59:59 +0000 UTC" firstStartedPulling="2025-12-09 04:00:00.937654472 +0000 UTC m=+2882.646959898" lastFinishedPulling="2025-12-09 04:00:13.772966099 +0000 UTC m=+2895.482271545" observedRunningTime="2025-12-09 04:00:14.084649815 +0000 UTC m=+2895.793955281" watchObservedRunningTime="2025-12-09 04:00:14.089338592 +0000 UTC m=+2895.798644018" Dec 09 04:00:15 crc kubenswrapper[4766]: I1209 04:00:15.829037 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:15 crc kubenswrapper[4766]: I1209 04:00:15.829074 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:15 crc kubenswrapper[4766]: I1209 04:00:15.893013 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:16 crc kubenswrapper[4766]: I1209 04:00:16.134191 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:16 crc kubenswrapper[4766]: I1209 04:00:16.322248 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2zzh7"] Dec 09 04:00:18 crc kubenswrapper[4766]: I1209 04:00:18.098729 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2zzh7" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerName="registry-server" containerID="cri-o://afdc0737787e6b018b232dd2802b915fde2bc9a5bc8b3a715653235a8c8909f0" gracePeriod=2 Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.109360 4766 
generic.go:334] "Generic (PLEG): container finished" podID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerID="afdc0737787e6b018b232dd2802b915fde2bc9a5bc8b3a715653235a8c8909f0" exitCode=0 Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.109448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zzh7" event={"ID":"dd35cb98-0d90-493c-b148-89bf8c6222f6","Type":"ContainerDied","Data":"afdc0737787e6b018b232dd2802b915fde2bc9a5bc8b3a715653235a8c8909f0"} Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.607443 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.643947 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.643993 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.681731 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsz2h\" (UniqueName: \"kubernetes.io/projected/dd35cb98-0d90-493c-b148-89bf8c6222f6-kube-api-access-zsz2h\") pod \"dd35cb98-0d90-493c-b148-89bf8c6222f6\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.682116 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-catalog-content\") pod \"dd35cb98-0d90-493c-b148-89bf8c6222f6\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.682189 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-utilities\") pod \"dd35cb98-0d90-493c-b148-89bf8c6222f6\" (UID: \"dd35cb98-0d90-493c-b148-89bf8c6222f6\") " Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.685838 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-utilities" (OuterVolumeSpecName: "utilities") pod "dd35cb98-0d90-493c-b148-89bf8c6222f6" (UID: "dd35cb98-0d90-493c-b148-89bf8c6222f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.692276 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd35cb98-0d90-493c-b148-89bf8c6222f6-kube-api-access-zsz2h" (OuterVolumeSpecName: "kube-api-access-zsz2h") pod "dd35cb98-0d90-493c-b148-89bf8c6222f6" (UID: "dd35cb98-0d90-493c-b148-89bf8c6222f6"). InnerVolumeSpecName "kube-api-access-zsz2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.693407 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.737933 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd35cb98-0d90-493c-b148-89bf8c6222f6" (UID: "dd35cb98-0d90-493c-b148-89bf8c6222f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.784679 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.784956 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsz2h\" (UniqueName: \"kubernetes.io/projected/dd35cb98-0d90-493c-b148-89bf8c6222f6-kube-api-access-zsz2h\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:19 crc kubenswrapper[4766]: I1209 04:00:19.785049 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd35cb98-0d90-493c-b148-89bf8c6222f6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.123374 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2zzh7" Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.123366 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zzh7" event={"ID":"dd35cb98-0d90-493c-b148-89bf8c6222f6","Type":"ContainerDied","Data":"570bba67080788c96b314ad7d03cd4320caccd665c52bd6166eaccef4d95d5d1"} Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.123462 4766 scope.go:117] "RemoveContainer" containerID="afdc0737787e6b018b232dd2802b915fde2bc9a5bc8b3a715653235a8c8909f0" Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.150495 4766 scope.go:117] "RemoveContainer" containerID="fd5ef444e6b278cc48a6f555cacef9024996ffefaf47d3dc55a1aa9554b5bb6f" Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.187996 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2zzh7"] Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.189350 4766 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.196755 4766 scope.go:117] "RemoveContainer" containerID="18a38ccb90df809f592d6b1e1dc586b97c072c9e688318932ce0de31dc48bc85" Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.200565 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2zzh7"] Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.731990 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5pjsr"] Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.839164 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:00:20 crc kubenswrapper[4766]: E1209 04:00:20.839455 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:00:20 crc kubenswrapper[4766]: I1209 04:00:20.851560 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" path="/var/lib/kubelet/pods/dd35cb98-0d90-493c-b148-89bf8c6222f6/volumes" Dec 09 04:00:22 crc kubenswrapper[4766]: I1209 04:00:22.144042 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5pjsr" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerName="registry-server" containerID="cri-o://745f4faeca3aedcbc27272615efc0bc021c3dc6d2bece5c9b4cd970211268f1b" gracePeriod=2 Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.160574 4766 
generic.go:334] "Generic (PLEG): container finished" podID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerID="745f4faeca3aedcbc27272615efc0bc021c3dc6d2bece5c9b4cd970211268f1b" exitCode=0 Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.160684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5pjsr" event={"ID":"36e8a294-0810-4bf5-99eb-72ef3bf74932","Type":"ContainerDied","Data":"745f4faeca3aedcbc27272615efc0bc021c3dc6d2bece5c9b4cd970211268f1b"} Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.365839 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.457513 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl7kp\" (UniqueName: \"kubernetes.io/projected/36e8a294-0810-4bf5-99eb-72ef3bf74932-kube-api-access-pl7kp\") pod \"36e8a294-0810-4bf5-99eb-72ef3bf74932\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.457594 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-catalog-content\") pod \"36e8a294-0810-4bf5-99eb-72ef3bf74932\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.457703 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-utilities\") pod \"36e8a294-0810-4bf5-99eb-72ef3bf74932\" (UID: \"36e8a294-0810-4bf5-99eb-72ef3bf74932\") " Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.458889 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-utilities" 
(OuterVolumeSpecName: "utilities") pod "36e8a294-0810-4bf5-99eb-72ef3bf74932" (UID: "36e8a294-0810-4bf5-99eb-72ef3bf74932"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.463105 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36e8a294-0810-4bf5-99eb-72ef3bf74932-kube-api-access-pl7kp" (OuterVolumeSpecName: "kube-api-access-pl7kp") pod "36e8a294-0810-4bf5-99eb-72ef3bf74932" (UID: "36e8a294-0810-4bf5-99eb-72ef3bf74932"). InnerVolumeSpecName "kube-api-access-pl7kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.559596 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl7kp\" (UniqueName: \"kubernetes.io/projected/36e8a294-0810-4bf5-99eb-72ef3bf74932-kube-api-access-pl7kp\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.559633 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.575427 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36e8a294-0810-4bf5-99eb-72ef3bf74932" (UID: "36e8a294-0810-4bf5-99eb-72ef3bf74932"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:00:24 crc kubenswrapper[4766]: I1209 04:00:24.661244 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36e8a294-0810-4bf5-99eb-72ef3bf74932-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:00:25 crc kubenswrapper[4766]: I1209 04:00:25.169489 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5pjsr" event={"ID":"36e8a294-0810-4bf5-99eb-72ef3bf74932","Type":"ContainerDied","Data":"47c01f4d010809a376afc38a08d3fbe93a4f4f21aba70814648b5a674e91957e"} Dec 09 04:00:25 crc kubenswrapper[4766]: I1209 04:00:25.169845 4766 scope.go:117] "RemoveContainer" containerID="745f4faeca3aedcbc27272615efc0bc021c3dc6d2bece5c9b4cd970211268f1b" Dec 09 04:00:25 crc kubenswrapper[4766]: I1209 04:00:25.169591 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5pjsr" Dec 09 04:00:25 crc kubenswrapper[4766]: I1209 04:00:25.191454 4766 scope.go:117] "RemoveContainer" containerID="e5449a9ee57c661ac20cda8cf6c9c026ac68dedc8ae3305087be1f24fa60d7db" Dec 09 04:00:25 crc kubenswrapper[4766]: I1209 04:00:25.194028 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5pjsr"] Dec 09 04:00:25 crc kubenswrapper[4766]: I1209 04:00:25.201163 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5pjsr"] Dec 09 04:00:25 crc kubenswrapper[4766]: I1209 04:00:25.210409 4766 scope.go:117] "RemoveContainer" containerID="db2ba3b370f2925bd575d8e7f764f7d3d9a3beaab1091b0e9bb7b8cec6024ae7" Dec 09 04:00:26 crc kubenswrapper[4766]: I1209 04:00:26.850269 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" path="/var/lib/kubelet/pods/36e8a294-0810-4bf5-99eb-72ef3bf74932/volumes" Dec 09 04:00:33 crc 
kubenswrapper[4766]: I1209 04:00:33.840357 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:00:33 crc kubenswrapper[4766]: E1209 04:00:33.841410 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:00:44 crc kubenswrapper[4766]: I1209 04:00:44.838933 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:00:44 crc kubenswrapper[4766]: E1209 04:00:44.839663 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:00:59 crc kubenswrapper[4766]: I1209 04:00:59.839651 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:00:59 crc kubenswrapper[4766]: E1209 04:00:59.840346 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 
09 04:01:11 crc kubenswrapper[4766]: I1209 04:01:11.839292 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:01:11 crc kubenswrapper[4766]: E1209 04:01:11.840106 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:01:26 crc kubenswrapper[4766]: I1209 04:01:26.839395 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:01:26 crc kubenswrapper[4766]: E1209 04:01:26.840047 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:01:38 crc kubenswrapper[4766]: I1209 04:01:38.853167 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:01:38 crc kubenswrapper[4766]: E1209 04:01:38.854202 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:01:53 crc kubenswrapper[4766]: I1209 04:01:53.840146 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:01:53 crc kubenswrapper[4766]: E1209 04:01:53.841199 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:02:07 crc kubenswrapper[4766]: I1209 04:02:07.839048 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:02:07 crc kubenswrapper[4766]: E1209 04:02:07.840054 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:02:19 crc kubenswrapper[4766]: I1209 04:02:19.839721 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:02:19 crc kubenswrapper[4766]: E1209 04:02:19.840932 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:02:31 crc kubenswrapper[4766]: I1209 04:02:31.840348 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:02:31 crc kubenswrapper[4766]: E1209 04:02:31.841359 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:02:45 crc kubenswrapper[4766]: I1209 04:02:45.840036 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:02:45 crc kubenswrapper[4766]: E1209 04:02:45.841130 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:03:00 crc kubenswrapper[4766]: I1209 04:03:00.840169 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:03:00 crc kubenswrapper[4766]: E1209 04:03:00.841297 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:03:14 crc kubenswrapper[4766]: I1209 04:03:14.839095 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:03:14 crc kubenswrapper[4766]: E1209 04:03:14.839847 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:03:26 crc kubenswrapper[4766]: I1209 04:03:26.839444 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:03:26 crc kubenswrapper[4766]: E1209 04:03:26.840191 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:03:37 crc kubenswrapper[4766]: I1209 04:03:37.839618 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:03:37 crc kubenswrapper[4766]: E1209 04:03:37.840677 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:03:52 crc kubenswrapper[4766]: I1209 04:03:52.839740 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:03:52 crc kubenswrapper[4766]: E1209 04:03:52.840776 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:04:06 crc kubenswrapper[4766]: I1209 04:04:06.840299 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:04:06 crc kubenswrapper[4766]: E1209 04:04:06.841600 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.264996 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wkm2c"] Dec 09 04:04:11 crc kubenswrapper[4766]: E1209 04:04:11.265853 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerName="extract-content" Dec 09 04:04:11 crc 
kubenswrapper[4766]: I1209 04:04:11.265877 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerName="extract-content" Dec 09 04:04:11 crc kubenswrapper[4766]: E1209 04:04:11.265904 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerName="extract-content" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.265918 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerName="extract-content" Dec 09 04:04:11 crc kubenswrapper[4766]: E1209 04:04:11.265940 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerName="registry-server" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.265954 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerName="registry-server" Dec 09 04:04:11 crc kubenswrapper[4766]: E1209 04:04:11.266007 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerName="extract-utilities" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.266020 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerName="extract-utilities" Dec 09 04:04:11 crc kubenswrapper[4766]: E1209 04:04:11.266040 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerName="extract-utilities" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.266053 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerName="extract-utilities" Dec 09 04:04:11 crc kubenswrapper[4766]: E1209 04:04:11.266077 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerName="registry-server" Dec 09 04:04:11 crc 
kubenswrapper[4766]: I1209 04:04:11.266090 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerName="registry-server" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.266420 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="36e8a294-0810-4bf5-99eb-72ef3bf74932" containerName="registry-server" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.266459 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd35cb98-0d90-493c-b148-89bf8c6222f6" containerName="registry-server" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.268473 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.289127 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wkm2c"] Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.438089 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-catalog-content\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.438230 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rg7\" (UniqueName: \"kubernetes.io/projected/9b22843d-f6ea-4131-85a2-12b7a802db6c-kube-api-access-c4rg7\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.438273 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-utilities\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.540204 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-catalog-content\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.540370 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rg7\" (UniqueName: \"kubernetes.io/projected/9b22843d-f6ea-4131-85a2-12b7a802db6c-kube-api-access-c4rg7\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.540407 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-utilities\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.540807 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-catalog-content\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.540922 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-utilities\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.560618 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4rg7\" (UniqueName: \"kubernetes.io/projected/9b22843d-f6ea-4131-85a2-12b7a802db6c-kube-api-access-c4rg7\") pod \"community-operators-wkm2c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:11 crc kubenswrapper[4766]: I1209 04:04:11.595936 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:12 crc kubenswrapper[4766]: I1209 04:04:12.073432 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wkm2c"] Dec 09 04:04:12 crc kubenswrapper[4766]: I1209 04:04:12.366788 4766 generic.go:334] "Generic (PLEG): container finished" podID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerID="5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025" exitCode=0 Dec 09 04:04:12 crc kubenswrapper[4766]: I1209 04:04:12.366837 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkm2c" event={"ID":"9b22843d-f6ea-4131-85a2-12b7a802db6c","Type":"ContainerDied","Data":"5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025"} Dec 09 04:04:12 crc kubenswrapper[4766]: I1209 04:04:12.366867 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkm2c" event={"ID":"9b22843d-f6ea-4131-85a2-12b7a802db6c","Type":"ContainerStarted","Data":"4e69112bf89039c4b2d61388b8c353dc81f445355faab8cc91c0db36021ee685"} Dec 09 04:04:13 crc kubenswrapper[4766]: I1209 04:04:13.375794 4766 generic.go:334] "Generic (PLEG): container 
finished" podID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerID="23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a" exitCode=0 Dec 09 04:04:13 crc kubenswrapper[4766]: I1209 04:04:13.375910 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkm2c" event={"ID":"9b22843d-f6ea-4131-85a2-12b7a802db6c","Type":"ContainerDied","Data":"23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a"} Dec 09 04:04:14 crc kubenswrapper[4766]: I1209 04:04:14.385024 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkm2c" event={"ID":"9b22843d-f6ea-4131-85a2-12b7a802db6c","Type":"ContainerStarted","Data":"3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c"} Dec 09 04:04:14 crc kubenswrapper[4766]: I1209 04:04:14.408165 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wkm2c" podStartSLOduration=2.038828884 podStartE2EDuration="3.408145821s" podCreationTimestamp="2025-12-09 04:04:11 +0000 UTC" firstStartedPulling="2025-12-09 04:04:12.368509329 +0000 UTC m=+3134.077814755" lastFinishedPulling="2025-12-09 04:04:13.737826266 +0000 UTC m=+3135.447131692" observedRunningTime="2025-12-09 04:04:14.401897903 +0000 UTC m=+3136.111203349" watchObservedRunningTime="2025-12-09 04:04:14.408145821 +0000 UTC m=+3136.117451247" Dec 09 04:04:18 crc kubenswrapper[4766]: I1209 04:04:18.847369 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:04:18 crc kubenswrapper[4766]: E1209 04:04:18.847938 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:04:21 crc kubenswrapper[4766]: I1209 04:04:21.596944 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:21 crc kubenswrapper[4766]: I1209 04:04:21.597497 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:21 crc kubenswrapper[4766]: I1209 04:04:21.668679 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:22 crc kubenswrapper[4766]: I1209 04:04:22.520835 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:22 crc kubenswrapper[4766]: I1209 04:04:22.583119 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wkm2c"] Dec 09 04:04:24 crc kubenswrapper[4766]: I1209 04:04:24.467190 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wkm2c" podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerName="registry-server" containerID="cri-o://3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c" gracePeriod=2 Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.422119 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.476955 4766 generic.go:334] "Generic (PLEG): container finished" podID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerID="3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c" exitCode=0 Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.477002 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkm2c" event={"ID":"9b22843d-f6ea-4131-85a2-12b7a802db6c","Type":"ContainerDied","Data":"3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c"} Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.477028 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkm2c" event={"ID":"9b22843d-f6ea-4131-85a2-12b7a802db6c","Type":"ContainerDied","Data":"4e69112bf89039c4b2d61388b8c353dc81f445355faab8cc91c0db36021ee685"} Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.477058 4766 scope.go:117] "RemoveContainer" containerID="3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.477077 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wkm2c" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.498935 4766 scope.go:117] "RemoveContainer" containerID="23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.522376 4766 scope.go:117] "RemoveContainer" containerID="5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.533054 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-utilities\") pod \"9b22843d-f6ea-4131-85a2-12b7a802db6c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.533127 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4rg7\" (UniqueName: \"kubernetes.io/projected/9b22843d-f6ea-4131-85a2-12b7a802db6c-kube-api-access-c4rg7\") pod \"9b22843d-f6ea-4131-85a2-12b7a802db6c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.533168 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-catalog-content\") pod \"9b22843d-f6ea-4131-85a2-12b7a802db6c\" (UID: \"9b22843d-f6ea-4131-85a2-12b7a802db6c\") " Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.534963 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-utilities" (OuterVolumeSpecName: "utilities") pod "9b22843d-f6ea-4131-85a2-12b7a802db6c" (UID: "9b22843d-f6ea-4131-85a2-12b7a802db6c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.535140 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.543192 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b22843d-f6ea-4131-85a2-12b7a802db6c-kube-api-access-c4rg7" (OuterVolumeSpecName: "kube-api-access-c4rg7") pod "9b22843d-f6ea-4131-85a2-12b7a802db6c" (UID: "9b22843d-f6ea-4131-85a2-12b7a802db6c"). InnerVolumeSpecName "kube-api-access-c4rg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.548576 4766 scope.go:117] "RemoveContainer" containerID="3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c" Dec 09 04:04:25 crc kubenswrapper[4766]: E1209 04:04:25.549261 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c\": container with ID starting with 3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c not found: ID does not exist" containerID="3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.549305 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c"} err="failed to get container status \"3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c\": rpc error: code = NotFound desc = could not find container \"3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c\": container with ID starting with 3ce7c4c3db355e945c5c44c8280b274b394b6429e0acd1a59733fa9466b37b9c not found: ID 
does not exist" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.549346 4766 scope.go:117] "RemoveContainer" containerID="23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a" Dec 09 04:04:25 crc kubenswrapper[4766]: E1209 04:04:25.549727 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a\": container with ID starting with 23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a not found: ID does not exist" containerID="23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.549763 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a"} err="failed to get container status \"23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a\": rpc error: code = NotFound desc = could not find container \"23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a\": container with ID starting with 23ebecc25037525f36195a92e963b6ca75569b3f0523837bfdbaf5115aaace5a not found: ID does not exist" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.549785 4766 scope.go:117] "RemoveContainer" containerID="5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025" Dec 09 04:04:25 crc kubenswrapper[4766]: E1209 04:04:25.550375 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025\": container with ID starting with 5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025 not found: ID does not exist" containerID="5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.550406 4766 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025"} err="failed to get container status \"5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025\": rpc error: code = NotFound desc = could not find container \"5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025\": container with ID starting with 5383f545e0c3b9127e145d0390e574d01801872a4d3ab0fe5636abb24901f025 not found: ID does not exist" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.595093 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b22843d-f6ea-4131-85a2-12b7a802db6c" (UID: "9b22843d-f6ea-4131-85a2-12b7a802db6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.635602 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4rg7\" (UniqueName: \"kubernetes.io/projected/9b22843d-f6ea-4131-85a2-12b7a802db6c-kube-api-access-c4rg7\") on node \"crc\" DevicePath \"\"" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.635651 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b22843d-f6ea-4131-85a2-12b7a802db6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.817407 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wkm2c"] Dec 09 04:04:25 crc kubenswrapper[4766]: I1209 04:04:25.822645 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wkm2c"] Dec 09 04:04:26 crc kubenswrapper[4766]: I1209 04:04:26.858432 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" path="/var/lib/kubelet/pods/9b22843d-f6ea-4131-85a2-12b7a802db6c/volumes" Dec 09 04:04:32 crc kubenswrapper[4766]: I1209 04:04:32.838703 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:04:32 crc kubenswrapper[4766]: E1209 04:04:32.839357 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:04:46 crc kubenswrapper[4766]: I1209 04:04:46.839972 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:04:46 crc kubenswrapper[4766]: E1209 04:04:46.840839 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:04:57 crc kubenswrapper[4766]: I1209 04:04:57.839745 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:04:57 crc kubenswrapper[4766]: E1209 04:04:57.840939 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:05:11 crc kubenswrapper[4766]: I1209 04:05:11.839352 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:05:12 crc kubenswrapper[4766]: I1209 04:05:12.861943 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"2afcfe56ec4967a39d86ceb95400a94b587480b4dae0213812f416046bd6d7ea"} Dec 09 04:07:37 crc kubenswrapper[4766]: I1209 04:07:37.316179 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:07:37 crc kubenswrapper[4766]: I1209 04:07:37.316944 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:08:07 crc kubenswrapper[4766]: I1209 04:08:07.316797 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:08:07 crc kubenswrapper[4766]: I1209 04:08:07.317765 4766 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:08:37 crc kubenswrapper[4766]: I1209 04:08:37.316624 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:08:37 crc kubenswrapper[4766]: I1209 04:08:37.317097 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:08:37 crc kubenswrapper[4766]: I1209 04:08:37.317148 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:08:37 crc kubenswrapper[4766]: I1209 04:08:37.317769 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2afcfe56ec4967a39d86ceb95400a94b587480b4dae0213812f416046bd6d7ea"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:08:37 crc kubenswrapper[4766]: I1209 04:08:37.317839 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" 
containerID="cri-o://2afcfe56ec4967a39d86ceb95400a94b587480b4dae0213812f416046bd6d7ea" gracePeriod=600 Dec 09 04:08:37 crc kubenswrapper[4766]: I1209 04:08:37.636817 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="2afcfe56ec4967a39d86ceb95400a94b587480b4dae0213812f416046bd6d7ea" exitCode=0 Dec 09 04:08:37 crc kubenswrapper[4766]: I1209 04:08:37.637265 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"2afcfe56ec4967a39d86ceb95400a94b587480b4dae0213812f416046bd6d7ea"} Dec 09 04:08:37 crc kubenswrapper[4766]: I1209 04:08:37.637332 4766 scope.go:117] "RemoveContainer" containerID="05cd6b5ce26b4efa5a1ba93302b81afe95b8a6ed13f5b981e0892064d01e4426" Dec 09 04:08:38 crc kubenswrapper[4766]: I1209 04:08:38.650048 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d"} Dec 09 04:10:37 crc kubenswrapper[4766]: I1209 04:10:37.316669 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:10:37 crc kubenswrapper[4766]: I1209 04:10:37.317273 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:11:07 crc kubenswrapper[4766]: 
I1209 04:11:07.316649 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:11:07 crc kubenswrapper[4766]: I1209 04:11:07.317103 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.002014 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5kv77"] Dec 09 04:11:21 crc kubenswrapper[4766]: E1209 04:11:21.003165 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerName="extract-utilities" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.003182 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerName="extract-utilities" Dec 09 04:11:21 crc kubenswrapper[4766]: E1209 04:11:21.003231 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerName="extract-content" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.003240 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerName="extract-content" Dec 09 04:11:21 crc kubenswrapper[4766]: E1209 04:11:21.003255 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerName="registry-server" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.003263 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerName="registry-server" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.003457 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b22843d-f6ea-4131-85a2-12b7a802db6c" containerName="registry-server" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.004794 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.021905 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5kv77"] Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.104720 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjfvt\" (UniqueName: \"kubernetes.io/projected/171fcfd0-caf1-4bbd-9298-c920c0c61de8-kube-api-access-xjfvt\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.104818 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-utilities\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.104858 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-catalog-content\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.206258 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xjfvt\" (UniqueName: \"kubernetes.io/projected/171fcfd0-caf1-4bbd-9298-c920c0c61de8-kube-api-access-xjfvt\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.206322 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-utilities\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.206347 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-catalog-content\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.206793 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-catalog-content\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.206931 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-utilities\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.228494 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xjfvt\" (UniqueName: \"kubernetes.io/projected/171fcfd0-caf1-4bbd-9298-c920c0c61de8-kube-api-access-xjfvt\") pod \"certified-operators-5kv77\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.342141 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.830967 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5kv77"] Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.997109 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kv77" event={"ID":"171fcfd0-caf1-4bbd-9298-c920c0c61de8","Type":"ContainerStarted","Data":"b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43"} Dec 09 04:11:21 crc kubenswrapper[4766]: I1209 04:11:21.997154 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kv77" event={"ID":"171fcfd0-caf1-4bbd-9298-c920c0c61de8","Type":"ContainerStarted","Data":"4f189451a317d6f2972ab75c14930d23ec029feaf9f32e5b3b4d2cab8da0d987"} Dec 09 04:11:23 crc kubenswrapper[4766]: I1209 04:11:23.005597 4766 generic.go:334] "Generic (PLEG): container finished" podID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerID="b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43" exitCode=0 Dec 09 04:11:23 crc kubenswrapper[4766]: I1209 04:11:23.005675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kv77" event={"ID":"171fcfd0-caf1-4bbd-9298-c920c0c61de8","Type":"ContainerDied","Data":"b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43"} Dec 09 04:11:23 crc kubenswrapper[4766]: I1209 04:11:23.007815 4766 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 09 04:11:27 crc kubenswrapper[4766]: I1209 04:11:27.036739 4766 generic.go:334] "Generic (PLEG): container finished" podID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerID="61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444" exitCode=0 Dec 09 04:11:27 crc kubenswrapper[4766]: I1209 04:11:27.036829 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kv77" event={"ID":"171fcfd0-caf1-4bbd-9298-c920c0c61de8","Type":"ContainerDied","Data":"61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444"} Dec 09 04:11:28 crc kubenswrapper[4766]: I1209 04:11:28.049032 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kv77" event={"ID":"171fcfd0-caf1-4bbd-9298-c920c0c61de8","Type":"ContainerStarted","Data":"63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f"} Dec 09 04:11:28 crc kubenswrapper[4766]: I1209 04:11:28.069553 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5kv77" podStartSLOduration=3.636820214 podStartE2EDuration="8.069531036s" podCreationTimestamp="2025-12-09 04:11:20 +0000 UTC" firstStartedPulling="2025-12-09 04:11:23.007589 +0000 UTC m=+3564.716894426" lastFinishedPulling="2025-12-09 04:11:27.440299822 +0000 UTC m=+3569.149605248" observedRunningTime="2025-12-09 04:11:28.066104324 +0000 UTC m=+3569.775409780" watchObservedRunningTime="2025-12-09 04:11:28.069531036 +0000 UTC m=+3569.778836462" Dec 09 04:11:31 crc kubenswrapper[4766]: I1209 04:11:31.343066 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:31 crc kubenswrapper[4766]: I1209 04:11:31.343561 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:31 crc 
kubenswrapper[4766]: I1209 04:11:31.387147 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:32 crc kubenswrapper[4766]: I1209 04:11:32.130224 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:11:32 crc kubenswrapper[4766]: I1209 04:11:32.210578 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5kv77"] Dec 09 04:11:32 crc kubenswrapper[4766]: I1209 04:11:32.250568 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtw8c"] Dec 09 04:11:32 crc kubenswrapper[4766]: I1209 04:11:32.250868 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dtw8c" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="registry-server" containerID="cri-o://867fa31cf6551b176bf9cc80b422b475f6916b09bd067f96cb1b571d54fbfaae" gracePeriod=2 Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.087101 4766 generic.go:334] "Generic (PLEG): container finished" podID="29393907-e416-489a-a2dc-92c75c0284bc" containerID="867fa31cf6551b176bf9cc80b422b475f6916b09bd067f96cb1b571d54fbfaae" exitCode=0 Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.087162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtw8c" event={"ID":"29393907-e416-489a-a2dc-92c75c0284bc","Type":"ContainerDied","Data":"867fa31cf6551b176bf9cc80b422b475f6916b09bd067f96cb1b571d54fbfaae"} Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.147486 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.294239 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ccjr\" (UniqueName: \"kubernetes.io/projected/29393907-e416-489a-a2dc-92c75c0284bc-kube-api-access-2ccjr\") pod \"29393907-e416-489a-a2dc-92c75c0284bc\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.294326 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-catalog-content\") pod \"29393907-e416-489a-a2dc-92c75c0284bc\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.294394 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-utilities\") pod \"29393907-e416-489a-a2dc-92c75c0284bc\" (UID: \"29393907-e416-489a-a2dc-92c75c0284bc\") " Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.295068 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-utilities" (OuterVolumeSpecName: "utilities") pod "29393907-e416-489a-a2dc-92c75c0284bc" (UID: "29393907-e416-489a-a2dc-92c75c0284bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.299869 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29393907-e416-489a-a2dc-92c75c0284bc-kube-api-access-2ccjr" (OuterVolumeSpecName: "kube-api-access-2ccjr") pod "29393907-e416-489a-a2dc-92c75c0284bc" (UID: "29393907-e416-489a-a2dc-92c75c0284bc"). InnerVolumeSpecName "kube-api-access-2ccjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.345793 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29393907-e416-489a-a2dc-92c75c0284bc" (UID: "29393907-e416-489a-a2dc-92c75c0284bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.396014 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.396063 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ccjr\" (UniqueName: \"kubernetes.io/projected/29393907-e416-489a-a2dc-92c75c0284bc-kube-api-access-2ccjr\") on node \"crc\" DevicePath \"\"" Dec 09 04:11:33 crc kubenswrapper[4766]: I1209 04:11:33.396075 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29393907-e416-489a-a2dc-92c75c0284bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:11:34 crc kubenswrapper[4766]: I1209 04:11:34.098514 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtw8c" event={"ID":"29393907-e416-489a-a2dc-92c75c0284bc","Type":"ContainerDied","Data":"442770a7a89a738f28cd82445385befbc675dc6d7043ca14fb93d536f8b01617"} Dec 09 04:11:34 crc kubenswrapper[4766]: I1209 04:11:34.098814 4766 scope.go:117] "RemoveContainer" containerID="867fa31cf6551b176bf9cc80b422b475f6916b09bd067f96cb1b571d54fbfaae" Dec 09 04:11:34 crc kubenswrapper[4766]: I1209 04:11:34.098574 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtw8c" Dec 09 04:11:34 crc kubenswrapper[4766]: I1209 04:11:34.135489 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtw8c"] Dec 09 04:11:34 crc kubenswrapper[4766]: I1209 04:11:34.137134 4766 scope.go:117] "RemoveContainer" containerID="94dc6cd5e89548729ec64cf815e3a0890f89af2d048150c079f259fd5331f25d" Dec 09 04:11:34 crc kubenswrapper[4766]: I1209 04:11:34.143693 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dtw8c"] Dec 09 04:11:34 crc kubenswrapper[4766]: I1209 04:11:34.153458 4766 scope.go:117] "RemoveContainer" containerID="a0e3a172c4a5a365bb6d1d7ad57a35d73a3bc9eee9bdb30ed17b28188086138e" Dec 09 04:11:34 crc kubenswrapper[4766]: I1209 04:11:34.849180 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29393907-e416-489a-a2dc-92c75c0284bc" path="/var/lib/kubelet/pods/29393907-e416-489a-a2dc-92c75c0284bc/volumes" Dec 09 04:11:37 crc kubenswrapper[4766]: I1209 04:11:37.316443 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:11:37 crc kubenswrapper[4766]: I1209 04:11:37.316809 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:11:37 crc kubenswrapper[4766]: I1209 04:11:37.316864 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 
04:11:37 crc kubenswrapper[4766]: I1209 04:11:37.317537 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:11:37 crc kubenswrapper[4766]: I1209 04:11:37.317609 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" gracePeriod=600 Dec 09 04:11:38 crc kubenswrapper[4766]: E1209 04:11:38.380951 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42b369b_e4ad_447c_b9b1_5c2461116838.slice/crio-conmon-2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d.scope\": RecentStats: unable to find data in memory cache]" Dec 09 04:11:38 crc kubenswrapper[4766]: E1209 04:11:38.552509 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:11:39 crc kubenswrapper[4766]: I1209 04:11:39.136578 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" exitCode=0 Dec 09 
04:11:39 crc kubenswrapper[4766]: I1209 04:11:39.136629 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d"} Dec 09 04:11:39 crc kubenswrapper[4766]: I1209 04:11:39.136669 4766 scope.go:117] "RemoveContainer" containerID="2afcfe56ec4967a39d86ceb95400a94b587480b4dae0213812f416046bd6d7ea" Dec 09 04:11:39 crc kubenswrapper[4766]: I1209 04:11:39.137512 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:11:39 crc kubenswrapper[4766]: E1209 04:11:39.137899 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:11:54 crc kubenswrapper[4766]: I1209 04:11:54.839984 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:11:54 crc kubenswrapper[4766]: E1209 04:11:54.841288 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:12:08 crc kubenswrapper[4766]: I1209 04:12:08.845768 4766 scope.go:117] "RemoveContainer" 
containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:12:08 crc kubenswrapper[4766]: E1209 04:12:08.846579 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.511105 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4ss9t"] Dec 09 04:12:15 crc kubenswrapper[4766]: E1209 04:12:15.511961 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="extract-utilities" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.511974 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="extract-utilities" Dec 09 04:12:15 crc kubenswrapper[4766]: E1209 04:12:15.512001 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="extract-content" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.512008 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="extract-content" Dec 09 04:12:15 crc kubenswrapper[4766]: E1209 04:12:15.512025 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="registry-server" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.512034 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="registry-server" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.512227 
4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="29393907-e416-489a-a2dc-92c75c0284bc" containerName="registry-server" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.513474 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.528199 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4ss9t"] Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.637164 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-utilities\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.637495 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h594h\" (UniqueName: \"kubernetes.io/projected/7bbf373a-2fa5-40fb-860c-5fa0bae15120-kube-api-access-h594h\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.637664 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-catalog-content\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.679548 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vgs2g"] Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.681155 4766 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.697084 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgs2g"] Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.738889 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-utilities\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.738938 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-utilities\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.738964 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h594h\" (UniqueName: \"kubernetes.io/projected/7bbf373a-2fa5-40fb-860c-5fa0bae15120-kube-api-access-h594h\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.739001 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-catalog-content\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.739023 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-catalog-content\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.739055 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j78kn\" (UniqueName: \"kubernetes.io/projected/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-kube-api-access-j78kn\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.739555 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-utilities\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.739754 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-catalog-content\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.772730 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h594h\" (UniqueName: \"kubernetes.io/projected/7bbf373a-2fa5-40fb-860c-5fa0bae15120-kube-api-access-h594h\") pod \"redhat-operators-4ss9t\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.829046 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.840044 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j78kn\" (UniqueName: \"kubernetes.io/projected/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-kube-api-access-j78kn\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.840101 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-utilities\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.840166 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-catalog-content\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.840663 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-catalog-content\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.840667 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-utilities\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " 
pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.863206 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j78kn\" (UniqueName: \"kubernetes.io/projected/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-kube-api-access-j78kn\") pod \"redhat-marketplace-vgs2g\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:15 crc kubenswrapper[4766]: I1209 04:12:15.994906 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:16 crc kubenswrapper[4766]: I1209 04:12:16.279597 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4ss9t"] Dec 09 04:12:16 crc kubenswrapper[4766]: I1209 04:12:16.468441 4766 generic.go:334] "Generic (PLEG): container finished" podID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerID="e68fdbf174a784c1836f962f6e482021288e04d95fede6063308fb5d3d294dcd" exitCode=0 Dec 09 04:12:16 crc kubenswrapper[4766]: I1209 04:12:16.468485 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ss9t" event={"ID":"7bbf373a-2fa5-40fb-860c-5fa0bae15120","Type":"ContainerDied","Data":"e68fdbf174a784c1836f962f6e482021288e04d95fede6063308fb5d3d294dcd"} Dec 09 04:12:16 crc kubenswrapper[4766]: I1209 04:12:16.468512 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ss9t" event={"ID":"7bbf373a-2fa5-40fb-860c-5fa0bae15120","Type":"ContainerStarted","Data":"89f8bfd3a440f15e290c6aa3a85d038cc17aa5c49d4bd54e4d892d9eac58d0c3"} Dec 09 04:12:16 crc kubenswrapper[4766]: I1209 04:12:16.520419 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgs2g"] Dec 09 04:12:17 crc kubenswrapper[4766]: I1209 04:12:17.480985 4766 generic.go:334] "Generic (PLEG): container 
finished" podID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerID="aeb0d56d40ed7a879015fec6190dcc5614c20c0b1fa030708c49e6705659ccf4" exitCode=0 Dec 09 04:12:17 crc kubenswrapper[4766]: I1209 04:12:17.481086 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgs2g" event={"ID":"5676d1dd-aa35-4dd2-ab85-aa5667a16db2","Type":"ContainerDied","Data":"aeb0d56d40ed7a879015fec6190dcc5614c20c0b1fa030708c49e6705659ccf4"} Dec 09 04:12:17 crc kubenswrapper[4766]: I1209 04:12:17.481465 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgs2g" event={"ID":"5676d1dd-aa35-4dd2-ab85-aa5667a16db2","Type":"ContainerStarted","Data":"29b3ab79d217d73552626382a87d384619815379e7b1c14883337af8e9ee34b3"} Dec 09 04:12:18 crc kubenswrapper[4766]: I1209 04:12:18.492498 4766 generic.go:334] "Generic (PLEG): container finished" podID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerID="d8af936f4b531ffda1d102aa5bfb1a4f27de41eb348e7ce72747e4ef729d3bbe" exitCode=0 Dec 09 04:12:18 crc kubenswrapper[4766]: I1209 04:12:18.492583 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ss9t" event={"ID":"7bbf373a-2fa5-40fb-860c-5fa0bae15120","Type":"ContainerDied","Data":"d8af936f4b531ffda1d102aa5bfb1a4f27de41eb348e7ce72747e4ef729d3bbe"} Dec 09 04:12:18 crc kubenswrapper[4766]: I1209 04:12:18.497702 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgs2g" event={"ID":"5676d1dd-aa35-4dd2-ab85-aa5667a16db2","Type":"ContainerStarted","Data":"a9695620cd76eba18f772f471f3c34b7f1278512a32a86998a770dae16005a46"} Dec 09 04:12:19 crc kubenswrapper[4766]: I1209 04:12:19.507356 4766 generic.go:334] "Generic (PLEG): container finished" podID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerID="a9695620cd76eba18f772f471f3c34b7f1278512a32a86998a770dae16005a46" exitCode=0 Dec 09 04:12:19 crc kubenswrapper[4766]: I1209 
04:12:19.507475 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgs2g" event={"ID":"5676d1dd-aa35-4dd2-ab85-aa5667a16db2","Type":"ContainerDied","Data":"a9695620cd76eba18f772f471f3c34b7f1278512a32a86998a770dae16005a46"} Dec 09 04:12:19 crc kubenswrapper[4766]: I1209 04:12:19.510927 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ss9t" event={"ID":"7bbf373a-2fa5-40fb-860c-5fa0bae15120","Type":"ContainerStarted","Data":"a78aa652326b801ebdd51a6344a5afb6244c34dcfcef3d995dc2903a84a50689"} Dec 09 04:12:19 crc kubenswrapper[4766]: I1209 04:12:19.553193 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4ss9t" podStartSLOduration=1.905008722 podStartE2EDuration="4.553178604s" podCreationTimestamp="2025-12-09 04:12:15 +0000 UTC" firstStartedPulling="2025-12-09 04:12:16.46962365 +0000 UTC m=+3618.178929076" lastFinishedPulling="2025-12-09 04:12:19.117793492 +0000 UTC m=+3620.827098958" observedRunningTime="2025-12-09 04:12:19.55117647 +0000 UTC m=+3621.260481916" watchObservedRunningTime="2025-12-09 04:12:19.553178604 +0000 UTC m=+3621.262484030" Dec 09 04:12:20 crc kubenswrapper[4766]: I1209 04:12:20.518909 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgs2g" event={"ID":"5676d1dd-aa35-4dd2-ab85-aa5667a16db2","Type":"ContainerStarted","Data":"72ce5fa298a1ade093fe382c3f20f869c5e768e3db48b40da3c9f5952c86df6f"} Dec 09 04:12:20 crc kubenswrapper[4766]: I1209 04:12:20.537771 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vgs2g" podStartSLOduration=3.106698133 podStartE2EDuration="5.537752705s" podCreationTimestamp="2025-12-09 04:12:15 +0000 UTC" firstStartedPulling="2025-12-09 04:12:17.483872253 +0000 UTC m=+3619.193177689" lastFinishedPulling="2025-12-09 04:12:19.914926835 +0000 UTC 
m=+3621.624232261" observedRunningTime="2025-12-09 04:12:20.535823654 +0000 UTC m=+3622.245129100" watchObservedRunningTime="2025-12-09 04:12:20.537752705 +0000 UTC m=+3622.247058141" Dec 09 04:12:22 crc kubenswrapper[4766]: I1209 04:12:22.839538 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:12:22 crc kubenswrapper[4766]: E1209 04:12:22.839941 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:12:25 crc kubenswrapper[4766]: I1209 04:12:25.830235 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:25 crc kubenswrapper[4766]: I1209 04:12:25.830530 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:25 crc kubenswrapper[4766]: I1209 04:12:25.868906 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:25 crc kubenswrapper[4766]: I1209 04:12:25.995892 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:25 crc kubenswrapper[4766]: I1209 04:12:25.995970 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:26 crc kubenswrapper[4766]: I1209 04:12:26.040069 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:26 
crc kubenswrapper[4766]: I1209 04:12:26.616353 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:26 crc kubenswrapper[4766]: I1209 04:12:26.635858 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:27 crc kubenswrapper[4766]: I1209 04:12:27.279692 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4ss9t"] Dec 09 04:12:28 crc kubenswrapper[4766]: I1209 04:12:28.274159 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgs2g"] Dec 09 04:12:28 crc kubenswrapper[4766]: I1209 04:12:28.586719 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vgs2g" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerName="registry-server" containerID="cri-o://72ce5fa298a1ade093fe382c3f20f869c5e768e3db48b40da3c9f5952c86df6f" gracePeriod=2 Dec 09 04:12:28 crc kubenswrapper[4766]: I1209 04:12:28.586897 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4ss9t" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerName="registry-server" containerID="cri-o://a78aa652326b801ebdd51a6344a5afb6244c34dcfcef3d995dc2903a84a50689" gracePeriod=2 Dec 09 04:12:30 crc kubenswrapper[4766]: I1209 04:12:30.616766 4766 generic.go:334] "Generic (PLEG): container finished" podID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerID="72ce5fa298a1ade093fe382c3f20f869c5e768e3db48b40da3c9f5952c86df6f" exitCode=0 Dec 09 04:12:30 crc kubenswrapper[4766]: I1209 04:12:30.616856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgs2g" 
event={"ID":"5676d1dd-aa35-4dd2-ab85-aa5667a16db2","Type":"ContainerDied","Data":"72ce5fa298a1ade093fe382c3f20f869c5e768e3db48b40da3c9f5952c86df6f"} Dec 09 04:12:30 crc kubenswrapper[4766]: I1209 04:12:30.620657 4766 generic.go:334] "Generic (PLEG): container finished" podID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerID="a78aa652326b801ebdd51a6344a5afb6244c34dcfcef3d995dc2903a84a50689" exitCode=0 Dec 09 04:12:30 crc kubenswrapper[4766]: I1209 04:12:30.620700 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ss9t" event={"ID":"7bbf373a-2fa5-40fb-860c-5fa0bae15120","Type":"ContainerDied","Data":"a78aa652326b801ebdd51a6344a5afb6244c34dcfcef3d995dc2903a84a50689"} Dec 09 04:12:30 crc kubenswrapper[4766]: I1209 04:12:30.914276 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:30 crc kubenswrapper[4766]: I1209 04:12:30.920430 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.040802 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-utilities\") pod \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.040851 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-catalog-content\") pod \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.040894 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-utilities\") pod \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.040933 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j78kn\" (UniqueName: \"kubernetes.io/projected/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-kube-api-access-j78kn\") pod \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.040977 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h594h\" (UniqueName: \"kubernetes.io/projected/7bbf373a-2fa5-40fb-860c-5fa0bae15120-kube-api-access-h594h\") pod \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\" (UID: \"7bbf373a-2fa5-40fb-860c-5fa0bae15120\") " Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.041015 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-catalog-content\") pod \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\" (UID: \"5676d1dd-aa35-4dd2-ab85-aa5667a16db2\") " Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.041768 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-utilities" (OuterVolumeSpecName: "utilities") pod "5676d1dd-aa35-4dd2-ab85-aa5667a16db2" (UID: "5676d1dd-aa35-4dd2-ab85-aa5667a16db2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.041828 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-utilities" (OuterVolumeSpecName: "utilities") pod "7bbf373a-2fa5-40fb-860c-5fa0bae15120" (UID: "7bbf373a-2fa5-40fb-860c-5fa0bae15120"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.046111 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbf373a-2fa5-40fb-860c-5fa0bae15120-kube-api-access-h594h" (OuterVolumeSpecName: "kube-api-access-h594h") pod "7bbf373a-2fa5-40fb-860c-5fa0bae15120" (UID: "7bbf373a-2fa5-40fb-860c-5fa0bae15120"). InnerVolumeSpecName "kube-api-access-h594h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.053444 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-kube-api-access-j78kn" (OuterVolumeSpecName: "kube-api-access-j78kn") pod "5676d1dd-aa35-4dd2-ab85-aa5667a16db2" (UID: "5676d1dd-aa35-4dd2-ab85-aa5667a16db2"). InnerVolumeSpecName "kube-api-access-j78kn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.062603 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5676d1dd-aa35-4dd2-ab85-aa5667a16db2" (UID: "5676d1dd-aa35-4dd2-ab85-aa5667a16db2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.143036 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.143070 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j78kn\" (UniqueName: \"kubernetes.io/projected/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-kube-api-access-j78kn\") on node \"crc\" DevicePath \"\"" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.143084 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h594h\" (UniqueName: \"kubernetes.io/projected/7bbf373a-2fa5-40fb-860c-5fa0bae15120-kube-api-access-h594h\") on node \"crc\" DevicePath \"\"" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.143098 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5676d1dd-aa35-4dd2-ab85-aa5667a16db2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.143109 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.171368 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bbf373a-2fa5-40fb-860c-5fa0bae15120" (UID: "7bbf373a-2fa5-40fb-860c-5fa0bae15120"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.244583 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbf373a-2fa5-40fb-860c-5fa0bae15120-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.631088 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgs2g" event={"ID":"5676d1dd-aa35-4dd2-ab85-aa5667a16db2","Type":"ContainerDied","Data":"29b3ab79d217d73552626382a87d384619815379e7b1c14883337af8e9ee34b3"} Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.631145 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgs2g" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.632259 4766 scope.go:117] "RemoveContainer" containerID="72ce5fa298a1ade093fe382c3f20f869c5e768e3db48b40da3c9f5952c86df6f" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.643529 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4ss9t" event={"ID":"7bbf373a-2fa5-40fb-860c-5fa0bae15120","Type":"ContainerDied","Data":"89f8bfd3a440f15e290c6aa3a85d038cc17aa5c49d4bd54e4d892d9eac58d0c3"} Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.643717 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4ss9t" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.655200 4766 scope.go:117] "RemoveContainer" containerID="a9695620cd76eba18f772f471f3c34b7f1278512a32a86998a770dae16005a46" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.678913 4766 scope.go:117] "RemoveContainer" containerID="aeb0d56d40ed7a879015fec6190dcc5614c20c0b1fa030708c49e6705659ccf4" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.693492 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4ss9t"] Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.710450 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4ss9t"] Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.710635 4766 scope.go:117] "RemoveContainer" containerID="a78aa652326b801ebdd51a6344a5afb6244c34dcfcef3d995dc2903a84a50689" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.715998 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgs2g"] Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.720745 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgs2g"] Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.729042 4766 scope.go:117] "RemoveContainer" containerID="d8af936f4b531ffda1d102aa5bfb1a4f27de41eb348e7ce72747e4ef729d3bbe" Dec 09 04:12:31 crc kubenswrapper[4766]: I1209 04:12:31.749853 4766 scope.go:117] "RemoveContainer" containerID="e68fdbf174a784c1836f962f6e482021288e04d95fede6063308fb5d3d294dcd" Dec 09 04:12:32 crc kubenswrapper[4766]: I1209 04:12:32.849240 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" path="/var/lib/kubelet/pods/5676d1dd-aa35-4dd2-ab85-aa5667a16db2/volumes" Dec 09 04:12:32 crc kubenswrapper[4766]: I1209 04:12:32.849849 4766 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" path="/var/lib/kubelet/pods/7bbf373a-2fa5-40fb-860c-5fa0bae15120/volumes" Dec 09 04:12:35 crc kubenswrapper[4766]: I1209 04:12:35.839139 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:12:35 crc kubenswrapper[4766]: E1209 04:12:35.839735 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:12:49 crc kubenswrapper[4766]: I1209 04:12:49.838634 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:12:49 crc kubenswrapper[4766]: E1209 04:12:49.839732 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:13:01 crc kubenswrapper[4766]: I1209 04:13:01.839282 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:13:01 crc kubenswrapper[4766]: E1209 04:13:01.840078 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:13:15 crc kubenswrapper[4766]: I1209 04:13:15.839152 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:13:15 crc kubenswrapper[4766]: E1209 04:13:15.840609 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:13:29 crc kubenswrapper[4766]: I1209 04:13:29.840020 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:13:29 crc kubenswrapper[4766]: E1209 04:13:29.840840 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:13:43 crc kubenswrapper[4766]: I1209 04:13:43.839523 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:13:43 crc kubenswrapper[4766]: E1209 04:13:43.840731 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:13:57 crc kubenswrapper[4766]: I1209 04:13:57.839839 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:13:57 crc kubenswrapper[4766]: E1209 04:13:57.840650 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:14:09 crc kubenswrapper[4766]: I1209 04:14:09.839454 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:14:09 crc kubenswrapper[4766]: E1209 04:14:09.840132 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:14:20 crc kubenswrapper[4766]: I1209 04:14:20.840095 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:14:20 crc kubenswrapper[4766]: E1209 04:14:20.840803 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:14:32 crc kubenswrapper[4766]: I1209 04:14:32.839297 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:14:32 crc kubenswrapper[4766]: E1209 04:14:32.840090 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:14:46 crc kubenswrapper[4766]: I1209 04:14:46.840367 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:14:46 crc kubenswrapper[4766]: E1209 04:14:46.841755 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.176174 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn"] Dec 09 04:15:00 crc kubenswrapper[4766]: E1209 04:15:00.177176 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" 
containerName="extract-content" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.177199 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerName="extract-content" Dec 09 04:15:00 crc kubenswrapper[4766]: E1209 04:15:00.177248 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerName="extract-utilities" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.177261 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerName="extract-utilities" Dec 09 04:15:00 crc kubenswrapper[4766]: E1209 04:15:00.177295 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerName="extract-utilities" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.177307 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerName="extract-utilities" Dec 09 04:15:00 crc kubenswrapper[4766]: E1209 04:15:00.177321 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerName="registry-server" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.177331 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerName="registry-server" Dec 09 04:15:00 crc kubenswrapper[4766]: E1209 04:15:00.177354 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerName="registry-server" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.177366 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerName="registry-server" Dec 09 04:15:00 crc kubenswrapper[4766]: E1209 04:15:00.177385 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" 
containerName="extract-content" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.177395 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerName="extract-content" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.177605 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5676d1dd-aa35-4dd2-ab85-aa5667a16db2" containerName="registry-server" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.177625 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbf373a-2fa5-40fb-860c-5fa0bae15120" containerName="registry-server" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.178207 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.181313 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.191285 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.191653 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn"] Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.287329 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be850a52-d57f-4c18-a9b2-209ad7879827-config-volume\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.287620 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be850a52-d57f-4c18-a9b2-209ad7879827-secret-volume\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.287746 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2fl\" (UniqueName: \"kubernetes.io/projected/be850a52-d57f-4c18-a9b2-209ad7879827-kube-api-access-2q2fl\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.388663 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be850a52-d57f-4c18-a9b2-209ad7879827-secret-volume\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.388735 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2fl\" (UniqueName: \"kubernetes.io/projected/be850a52-d57f-4c18-a9b2-209ad7879827-kube-api-access-2q2fl\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.388776 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be850a52-d57f-4c18-a9b2-209ad7879827-config-volume\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.389838 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be850a52-d57f-4c18-a9b2-209ad7879827-config-volume\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.398564 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be850a52-d57f-4c18-a9b2-209ad7879827-secret-volume\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.417616 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2fl\" (UniqueName: \"kubernetes.io/projected/be850a52-d57f-4c18-a9b2-209ad7879827-kube-api-access-2q2fl\") pod \"collect-profiles-29420895-76wnn\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.497856 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:00 crc kubenswrapper[4766]: I1209 04:15:00.932316 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn"] Dec 09 04:15:01 crc kubenswrapper[4766]: I1209 04:15:01.009039 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" event={"ID":"be850a52-d57f-4c18-a9b2-209ad7879827","Type":"ContainerStarted","Data":"afb76b237ce5f5eeb8c1ac77791cbb1c3e96f74e7001cb010508c3069ce13758"} Dec 09 04:15:01 crc kubenswrapper[4766]: I1209 04:15:01.839129 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:15:01 crc kubenswrapper[4766]: E1209 04:15:01.839422 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:15:02 crc kubenswrapper[4766]: I1209 04:15:02.020146 4766 generic.go:334] "Generic (PLEG): container finished" podID="be850a52-d57f-4c18-a9b2-209ad7879827" containerID="c391c1b015598de9f6366dc22be451a03dcab1f5e6dba611fb7c34df1d269a44" exitCode=0 Dec 09 04:15:02 crc kubenswrapper[4766]: I1209 04:15:02.020197 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" event={"ID":"be850a52-d57f-4c18-a9b2-209ad7879827","Type":"ContainerDied","Data":"c391c1b015598de9f6366dc22be451a03dcab1f5e6dba611fb7c34df1d269a44"} Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.327763 4766 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.426810 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q2fl\" (UniqueName: \"kubernetes.io/projected/be850a52-d57f-4c18-a9b2-209ad7879827-kube-api-access-2q2fl\") pod \"be850a52-d57f-4c18-a9b2-209ad7879827\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.427966 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be850a52-d57f-4c18-a9b2-209ad7879827-config-volume\") pod \"be850a52-d57f-4c18-a9b2-209ad7879827\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.428268 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be850a52-d57f-4c18-a9b2-209ad7879827-secret-volume\") pod \"be850a52-d57f-4c18-a9b2-209ad7879827\" (UID: \"be850a52-d57f-4c18-a9b2-209ad7879827\") " Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.428550 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be850a52-d57f-4c18-a9b2-209ad7879827-config-volume" (OuterVolumeSpecName: "config-volume") pod "be850a52-d57f-4c18-a9b2-209ad7879827" (UID: "be850a52-d57f-4c18-a9b2-209ad7879827"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.428728 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be850a52-d57f-4c18-a9b2-209ad7879827-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.433535 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be850a52-d57f-4c18-a9b2-209ad7879827-kube-api-access-2q2fl" (OuterVolumeSpecName: "kube-api-access-2q2fl") pod "be850a52-d57f-4c18-a9b2-209ad7879827" (UID: "be850a52-d57f-4c18-a9b2-209ad7879827"). InnerVolumeSpecName "kube-api-access-2q2fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.436451 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be850a52-d57f-4c18-a9b2-209ad7879827-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be850a52-d57f-4c18-a9b2-209ad7879827" (UID: "be850a52-d57f-4c18-a9b2-209ad7879827"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.530063 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q2fl\" (UniqueName: \"kubernetes.io/projected/be850a52-d57f-4c18-a9b2-209ad7879827-kube-api-access-2q2fl\") on node \"crc\" DevicePath \"\"" Dec 09 04:15:03 crc kubenswrapper[4766]: I1209 04:15:03.530112 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be850a52-d57f-4c18-a9b2-209ad7879827-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 04:15:04 crc kubenswrapper[4766]: I1209 04:15:04.046838 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" event={"ID":"be850a52-d57f-4c18-a9b2-209ad7879827","Type":"ContainerDied","Data":"afb76b237ce5f5eeb8c1ac77791cbb1c3e96f74e7001cb010508c3069ce13758"} Dec 09 04:15:04 crc kubenswrapper[4766]: I1209 04:15:04.047263 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb76b237ce5f5eeb8c1ac77791cbb1c3e96f74e7001cb010508c3069ce13758" Dec 09 04:15:04 crc kubenswrapper[4766]: I1209 04:15:04.047504 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn" Dec 09 04:15:04 crc kubenswrapper[4766]: I1209 04:15:04.443391 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd"] Dec 09 04:15:04 crc kubenswrapper[4766]: I1209 04:15:04.449022 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420850-w6fpd"] Dec 09 04:15:04 crc kubenswrapper[4766]: I1209 04:15:04.846879 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc3c00f-71ec-4d0e-8e7e-9d3476e35995" path="/var/lib/kubelet/pods/0bc3c00f-71ec-4d0e-8e7e-9d3476e35995/volumes" Dec 09 04:15:12 crc kubenswrapper[4766]: I1209 04:15:12.877188 4766 scope.go:117] "RemoveContainer" containerID="6fd5729ab6e5bde7239983435485a5329d84ed065309b7d8a8c00d11612017b0" Dec 09 04:15:16 crc kubenswrapper[4766]: I1209 04:15:16.839371 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:15:16 crc kubenswrapper[4766]: E1209 04:15:16.839979 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:15:31 crc kubenswrapper[4766]: I1209 04:15:31.839385 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:15:31 crc kubenswrapper[4766]: E1209 04:15:31.840436 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:15:42 crc kubenswrapper[4766]: I1209 04:15:42.839878 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:15:42 crc kubenswrapper[4766]: E1209 04:15:42.840607 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:15:53 crc kubenswrapper[4766]: I1209 04:15:53.839328 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:15:53 crc kubenswrapper[4766]: E1209 04:15:53.840199 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:16:05 crc kubenswrapper[4766]: I1209 04:16:05.840140 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:16:05 crc kubenswrapper[4766]: E1209 04:16:05.841280 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:16:16 crc kubenswrapper[4766]: I1209 04:16:16.839016 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:16:16 crc kubenswrapper[4766]: E1209 04:16:16.839637 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:16:28 crc kubenswrapper[4766]: I1209 04:16:28.848425 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:16:28 crc kubenswrapper[4766]: E1209 04:16:28.849517 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:16:42 crc kubenswrapper[4766]: I1209 04:16:42.840014 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:16:43 crc kubenswrapper[4766]: I1209 04:16:43.914362 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"c5a7e8c76f2e51d9c804a5f3e67b38e0c969ac446f32a8ed39903e76c208d54b"} Dec 09 04:19:07 crc kubenswrapper[4766]: I1209 04:19:07.316836 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:19:07 crc kubenswrapper[4766]: I1209 04:19:07.317502 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.807857 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f4x9d"] Dec 09 04:19:33 crc kubenswrapper[4766]: E1209 04:19:33.810508 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be850a52-d57f-4c18-a9b2-209ad7879827" containerName="collect-profiles" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.810665 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="be850a52-d57f-4c18-a9b2-209ad7879827" containerName="collect-profiles" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.811009 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="be850a52-d57f-4c18-a9b2-209ad7879827" containerName="collect-profiles" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.812876 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.822142 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-utilities\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.822292 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kf6x\" (UniqueName: \"kubernetes.io/projected/790acac9-5b74-45e4-a18e-08c21c6d21cb-kube-api-access-2kf6x\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.822372 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-catalog-content\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.826153 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4x9d"] Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.923357 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kf6x\" (UniqueName: \"kubernetes.io/projected/790acac9-5b74-45e4-a18e-08c21c6d21cb-kube-api-access-2kf6x\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.923648 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-catalog-content\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.923885 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-utilities\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.924689 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-utilities\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.924721 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-catalog-content\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:33 crc kubenswrapper[4766]: I1209 04:19:33.948347 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kf6x\" (UniqueName: \"kubernetes.io/projected/790acac9-5b74-45e4-a18e-08c21c6d21cb-kube-api-access-2kf6x\") pod \"community-operators-f4x9d\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:34 crc kubenswrapper[4766]: I1209 04:19:34.148732 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:34 crc kubenswrapper[4766]: I1209 04:19:34.647741 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4x9d"] Dec 09 04:19:35 crc kubenswrapper[4766]: I1209 04:19:35.622465 4766 generic.go:334] "Generic (PLEG): container finished" podID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerID="0366d0d1e14bfda0b31a53a1e7c05662d5156022a3af99788dfdd049c0f6a767" exitCode=0 Dec 09 04:19:35 crc kubenswrapper[4766]: I1209 04:19:35.622589 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4x9d" event={"ID":"790acac9-5b74-45e4-a18e-08c21c6d21cb","Type":"ContainerDied","Data":"0366d0d1e14bfda0b31a53a1e7c05662d5156022a3af99788dfdd049c0f6a767"} Dec 09 04:19:35 crc kubenswrapper[4766]: I1209 04:19:35.622813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4x9d" event={"ID":"790acac9-5b74-45e4-a18e-08c21c6d21cb","Type":"ContainerStarted","Data":"9ce18f395ecb384a442423b1aa0d27c2619a78a0a649cb3e34c6e463807aa150"} Dec 09 04:19:35 crc kubenswrapper[4766]: I1209 04:19:35.626488 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 04:19:36 crc kubenswrapper[4766]: I1209 04:19:36.632719 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4x9d" event={"ID":"790acac9-5b74-45e4-a18e-08c21c6d21cb","Type":"ContainerStarted","Data":"55e9aaa2c2db2ea559447292cec8b686b2175c3066695b564aa0a49772998a84"} Dec 09 04:19:37 crc kubenswrapper[4766]: I1209 04:19:37.317122 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 
09 04:19:37 crc kubenswrapper[4766]: I1209 04:19:37.317193 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:19:37 crc kubenswrapper[4766]: I1209 04:19:37.646978 4766 generic.go:334] "Generic (PLEG): container finished" podID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerID="55e9aaa2c2db2ea559447292cec8b686b2175c3066695b564aa0a49772998a84" exitCode=0 Dec 09 04:19:37 crc kubenswrapper[4766]: I1209 04:19:37.647043 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4x9d" event={"ID":"790acac9-5b74-45e4-a18e-08c21c6d21cb","Type":"ContainerDied","Data":"55e9aaa2c2db2ea559447292cec8b686b2175c3066695b564aa0a49772998a84"} Dec 09 04:19:38 crc kubenswrapper[4766]: I1209 04:19:38.664790 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4x9d" event={"ID":"790acac9-5b74-45e4-a18e-08c21c6d21cb","Type":"ContainerStarted","Data":"93b39be6331fd5f679ec0c47b941c137d67cb6f7a11dd754bb2ec35a39a4a766"} Dec 09 04:19:38 crc kubenswrapper[4766]: I1209 04:19:38.685822 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f4x9d" podStartSLOduration=3.280197839 podStartE2EDuration="5.685806902s" podCreationTimestamp="2025-12-09 04:19:33 +0000 UTC" firstStartedPulling="2025-12-09 04:19:35.625822356 +0000 UTC m=+4057.335127822" lastFinishedPulling="2025-12-09 04:19:38.031431419 +0000 UTC m=+4059.740736885" observedRunningTime="2025-12-09 04:19:38.682164954 +0000 UTC m=+4060.391470390" watchObservedRunningTime="2025-12-09 04:19:38.685806902 +0000 UTC m=+4060.395112328" Dec 09 04:19:44 crc kubenswrapper[4766]: I1209 04:19:44.150831 4766 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:44 crc kubenswrapper[4766]: I1209 04:19:44.151632 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:44 crc kubenswrapper[4766]: I1209 04:19:44.197991 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:44 crc kubenswrapper[4766]: I1209 04:19:44.774637 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:44 crc kubenswrapper[4766]: I1209 04:19:44.858334 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f4x9d"] Dec 09 04:19:46 crc kubenswrapper[4766]: I1209 04:19:46.728969 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f4x9d" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerName="registry-server" containerID="cri-o://93b39be6331fd5f679ec0c47b941c137d67cb6f7a11dd754bb2ec35a39a4a766" gracePeriod=2 Dec 09 04:19:48 crc kubenswrapper[4766]: I1209 04:19:48.751643 4766 generic.go:334] "Generic (PLEG): container finished" podID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerID="93b39be6331fd5f679ec0c47b941c137d67cb6f7a11dd754bb2ec35a39a4a766" exitCode=0 Dec 09 04:19:48 crc kubenswrapper[4766]: I1209 04:19:48.751705 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4x9d" event={"ID":"790acac9-5b74-45e4-a18e-08c21c6d21cb","Type":"ContainerDied","Data":"93b39be6331fd5f679ec0c47b941c137d67cb6f7a11dd754bb2ec35a39a4a766"} Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.084262 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.259411 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kf6x\" (UniqueName: \"kubernetes.io/projected/790acac9-5b74-45e4-a18e-08c21c6d21cb-kube-api-access-2kf6x\") pod \"790acac9-5b74-45e4-a18e-08c21c6d21cb\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.259866 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-catalog-content\") pod \"790acac9-5b74-45e4-a18e-08c21c6d21cb\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.259974 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-utilities\") pod \"790acac9-5b74-45e4-a18e-08c21c6d21cb\" (UID: \"790acac9-5b74-45e4-a18e-08c21c6d21cb\") " Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.261062 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-utilities" (OuterVolumeSpecName: "utilities") pod "790acac9-5b74-45e4-a18e-08c21c6d21cb" (UID: "790acac9-5b74-45e4-a18e-08c21c6d21cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.264646 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790acac9-5b74-45e4-a18e-08c21c6d21cb-kube-api-access-2kf6x" (OuterVolumeSpecName: "kube-api-access-2kf6x") pod "790acac9-5b74-45e4-a18e-08c21c6d21cb" (UID: "790acac9-5b74-45e4-a18e-08c21c6d21cb"). InnerVolumeSpecName "kube-api-access-2kf6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.315421 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "790acac9-5b74-45e4-a18e-08c21c6d21cb" (UID: "790acac9-5b74-45e4-a18e-08c21c6d21cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.361329 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.361360 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790acac9-5b74-45e4-a18e-08c21c6d21cb-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.361370 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kf6x\" (UniqueName: \"kubernetes.io/projected/790acac9-5b74-45e4-a18e-08c21c6d21cb-kube-api-access-2kf6x\") on node \"crc\" DevicePath \"\"" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.780544 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4x9d" event={"ID":"790acac9-5b74-45e4-a18e-08c21c6d21cb","Type":"ContainerDied","Data":"9ce18f395ecb384a442423b1aa0d27c2619a78a0a649cb3e34c6e463807aa150"} Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.780627 4766 scope.go:117] "RemoveContainer" containerID="93b39be6331fd5f679ec0c47b941c137d67cb6f7a11dd754bb2ec35a39a4a766" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.780702 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4x9d" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.812986 4766 scope.go:117] "RemoveContainer" containerID="55e9aaa2c2db2ea559447292cec8b686b2175c3066695b564aa0a49772998a84" Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.825699 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f4x9d"] Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.831035 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f4x9d"] Dec 09 04:19:49 crc kubenswrapper[4766]: I1209 04:19:49.838245 4766 scope.go:117] "RemoveContainer" containerID="0366d0d1e14bfda0b31a53a1e7c05662d5156022a3af99788dfdd049c0f6a767" Dec 09 04:19:50 crc kubenswrapper[4766]: I1209 04:19:50.847789 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" path="/var/lib/kubelet/pods/790acac9-5b74-45e4-a18e-08c21c6d21cb/volumes" Dec 09 04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.316459 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.317173 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.317289 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 
04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.318172 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5a7e8c76f2e51d9c804a5f3e67b38e0c969ac446f32a8ed39903e76c208d54b"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.318356 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://c5a7e8c76f2e51d9c804a5f3e67b38e0c969ac446f32a8ed39903e76c208d54b" gracePeriod=600 Dec 09 04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.947457 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="c5a7e8c76f2e51d9c804a5f3e67b38e0c969ac446f32a8ed39903e76c208d54b" exitCode=0 Dec 09 04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.947520 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"c5a7e8c76f2e51d9c804a5f3e67b38e0c969ac446f32a8ed39903e76c208d54b"} Dec 09 04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.947802 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616"} Dec 09 04:20:07 crc kubenswrapper[4766]: I1209 04:20:07.947833 4766 scope.go:117] "RemoveContainer" containerID="2cf7701da1648155c37a641bc13b0abbaf918d82703c0729ee23807f2acec65d" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.670073 4766 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-74sb9"] Dec 09 04:21:52 crc kubenswrapper[4766]: E1209 04:21:52.671023 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerName="extract-content" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.671040 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerName="extract-content" Dec 09 04:21:52 crc kubenswrapper[4766]: E1209 04:21:52.671069 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerName="registry-server" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.671077 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerName="registry-server" Dec 09 04:21:52 crc kubenswrapper[4766]: E1209 04:21:52.671103 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerName="extract-utilities" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.671109 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerName="extract-utilities" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.671270 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="790acac9-5b74-45e4-a18e-08c21c6d21cb" containerName="registry-server" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.672257 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.686136 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74sb9"] Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.861799 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-utilities\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.861866 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-catalog-content\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.861940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kjvm\" (UniqueName: \"kubernetes.io/projected/94aea104-da8a-4a93-b797-a58e84fdde35-kube-api-access-7kjvm\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.962979 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-catalog-content\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.963042 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7kjvm\" (UniqueName: \"kubernetes.io/projected/94aea104-da8a-4a93-b797-a58e84fdde35-kube-api-access-7kjvm\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.963793 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-catalog-content\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.963984 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-utilities\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.964329 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-utilities\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:52 crc kubenswrapper[4766]: I1209 04:21:52.984915 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kjvm\" (UniqueName: \"kubernetes.io/projected/94aea104-da8a-4a93-b797-a58e84fdde35-kube-api-access-7kjvm\") pod \"certified-operators-74sb9\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:53 crc kubenswrapper[4766]: I1209 04:21:53.029563 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:21:53 crc kubenswrapper[4766]: I1209 04:21:53.510044 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74sb9"] Dec 09 04:21:53 crc kubenswrapper[4766]: I1209 04:21:53.902523 4766 generic.go:334] "Generic (PLEG): container finished" podID="94aea104-da8a-4a93-b797-a58e84fdde35" containerID="76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471" exitCode=0 Dec 09 04:21:53 crc kubenswrapper[4766]: I1209 04:21:53.902568 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74sb9" event={"ID":"94aea104-da8a-4a93-b797-a58e84fdde35","Type":"ContainerDied","Data":"76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471"} Dec 09 04:21:53 crc kubenswrapper[4766]: I1209 04:21:53.902594 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74sb9" event={"ID":"94aea104-da8a-4a93-b797-a58e84fdde35","Type":"ContainerStarted","Data":"a6c1408b6cdbf500e3d40fe0a6fdf4c88de69099e252aa0d018896ba07acc3f2"} Dec 09 04:21:54 crc kubenswrapper[4766]: I1209 04:21:54.912316 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74sb9" event={"ID":"94aea104-da8a-4a93-b797-a58e84fdde35","Type":"ContainerStarted","Data":"0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d"} Dec 09 04:21:55 crc kubenswrapper[4766]: I1209 04:21:55.923417 4766 generic.go:334] "Generic (PLEG): container finished" podID="94aea104-da8a-4a93-b797-a58e84fdde35" containerID="0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d" exitCode=0 Dec 09 04:21:55 crc kubenswrapper[4766]: I1209 04:21:55.923499 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74sb9" 
event={"ID":"94aea104-da8a-4a93-b797-a58e84fdde35","Type":"ContainerDied","Data":"0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d"} Dec 09 04:21:56 crc kubenswrapper[4766]: I1209 04:21:56.933553 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74sb9" event={"ID":"94aea104-da8a-4a93-b797-a58e84fdde35","Type":"ContainerStarted","Data":"751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c"} Dec 09 04:21:56 crc kubenswrapper[4766]: I1209 04:21:56.955230 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-74sb9" podStartSLOduration=2.5079451170000002 podStartE2EDuration="4.955192417s" podCreationTimestamp="2025-12-09 04:21:52 +0000 UTC" firstStartedPulling="2025-12-09 04:21:53.904278284 +0000 UTC m=+4195.613583730" lastFinishedPulling="2025-12-09 04:21:56.351525594 +0000 UTC m=+4198.060831030" observedRunningTime="2025-12-09 04:21:56.953555852 +0000 UTC m=+4198.662861278" watchObservedRunningTime="2025-12-09 04:21:56.955192417 +0000 UTC m=+4198.664497843" Dec 09 04:22:03 crc kubenswrapper[4766]: I1209 04:22:03.029725 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:22:03 crc kubenswrapper[4766]: I1209 04:22:03.031419 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:22:03 crc kubenswrapper[4766]: I1209 04:22:03.087906 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:22:04 crc kubenswrapper[4766]: I1209 04:22:04.060106 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:22:04 crc kubenswrapper[4766]: I1209 04:22:04.116620 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-74sb9"] Dec 09 04:22:06 crc kubenswrapper[4766]: I1209 04:22:06.004447 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-74sb9" podUID="94aea104-da8a-4a93-b797-a58e84fdde35" containerName="registry-server" containerID="cri-o://751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c" gracePeriod=2 Dec 09 04:22:06 crc kubenswrapper[4766]: I1209 04:22:06.953493 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.012192 4766 generic.go:334] "Generic (PLEG): container finished" podID="94aea104-da8a-4a93-b797-a58e84fdde35" containerID="751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c" exitCode=0 Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.012251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74sb9" event={"ID":"94aea104-da8a-4a93-b797-a58e84fdde35","Type":"ContainerDied","Data":"751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c"} Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.012276 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74sb9" event={"ID":"94aea104-da8a-4a93-b797-a58e84fdde35","Type":"ContainerDied","Data":"a6c1408b6cdbf500e3d40fe0a6fdf4c88de69099e252aa0d018896ba07acc3f2"} Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.012291 4766 scope.go:117] "RemoveContainer" containerID="751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.012302 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74sb9" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.029491 4766 scope.go:117] "RemoveContainer" containerID="0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.048319 4766 scope.go:117] "RemoveContainer" containerID="76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.061057 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-catalog-content\") pod \"94aea104-da8a-4a93-b797-a58e84fdde35\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.061110 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-utilities\") pod \"94aea104-da8a-4a93-b797-a58e84fdde35\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.061192 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kjvm\" (UniqueName: \"kubernetes.io/projected/94aea104-da8a-4a93-b797-a58e84fdde35-kube-api-access-7kjvm\") pod \"94aea104-da8a-4a93-b797-a58e84fdde35\" (UID: \"94aea104-da8a-4a93-b797-a58e84fdde35\") " Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.062119 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-utilities" (OuterVolumeSpecName: "utilities") pod "94aea104-da8a-4a93-b797-a58e84fdde35" (UID: "94aea104-da8a-4a93-b797-a58e84fdde35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.071457 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94aea104-da8a-4a93-b797-a58e84fdde35-kube-api-access-7kjvm" (OuterVolumeSpecName: "kube-api-access-7kjvm") pod "94aea104-da8a-4a93-b797-a58e84fdde35" (UID: "94aea104-da8a-4a93-b797-a58e84fdde35"). InnerVolumeSpecName "kube-api-access-7kjvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.107989 4766 scope.go:117] "RemoveContainer" containerID="751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c" Dec 09 04:22:07 crc kubenswrapper[4766]: E1209 04:22:07.108659 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c\": container with ID starting with 751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c not found: ID does not exist" containerID="751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.108714 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c"} err="failed to get container status \"751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c\": rpc error: code = NotFound desc = could not find container \"751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c\": container with ID starting with 751fda4e293712c9c79e8cafaf3a2c70735fc045a9712376ea24693a38ae048c not found: ID does not exist" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.108747 4766 scope.go:117] "RemoveContainer" containerID="0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d" Dec 09 04:22:07 crc kubenswrapper[4766]: E1209 04:22:07.109180 
4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d\": container with ID starting with 0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d not found: ID does not exist" containerID="0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.109261 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d"} err="failed to get container status \"0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d\": rpc error: code = NotFound desc = could not find container \"0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d\": container with ID starting with 0c957d2e45805c5d0463601d03be2cbf96901bf243a199ae3bd1e905a89c393d not found: ID does not exist" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.109294 4766 scope.go:117] "RemoveContainer" containerID="76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471" Dec 09 04:22:07 crc kubenswrapper[4766]: E1209 04:22:07.109631 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471\": container with ID starting with 76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471 not found: ID does not exist" containerID="76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.109659 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471"} err="failed to get container status \"76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471\": rpc error: code = 
NotFound desc = could not find container \"76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471\": container with ID starting with 76bf0309d5d4238085d9e4f10e211e1cba40cbc59a8d0ecbbc5dea266119b471 not found: ID does not exist" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.116668 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94aea104-da8a-4a93-b797-a58e84fdde35" (UID: "94aea104-da8a-4a93-b797-a58e84fdde35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.162722 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.162960 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94aea104-da8a-4a93-b797-a58e84fdde35-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.163021 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kjvm\" (UniqueName: \"kubernetes.io/projected/94aea104-da8a-4a93-b797-a58e84fdde35-kube-api-access-7kjvm\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.316407 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.316761 4766 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.369031 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74sb9"] Dec 09 04:22:07 crc kubenswrapper[4766]: I1209 04:22:07.377939 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-74sb9"] Dec 09 04:22:08 crc kubenswrapper[4766]: I1209 04:22:08.855455 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94aea104-da8a-4a93-b797-a58e84fdde35" path="/var/lib/kubelet/pods/94aea104-da8a-4a93-b797-a58e84fdde35/volumes" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.614535 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4b62h"] Dec 09 04:22:30 crc kubenswrapper[4766]: E1209 04:22:30.615564 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94aea104-da8a-4a93-b797-a58e84fdde35" containerName="registry-server" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.615586 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="94aea104-da8a-4a93-b797-a58e84fdde35" containerName="registry-server" Dec 09 04:22:30 crc kubenswrapper[4766]: E1209 04:22:30.615639 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94aea104-da8a-4a93-b797-a58e84fdde35" containerName="extract-content" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.615652 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="94aea104-da8a-4a93-b797-a58e84fdde35" containerName="extract-content" Dec 09 04:22:30 crc kubenswrapper[4766]: E1209 04:22:30.615674 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="94aea104-da8a-4a93-b797-a58e84fdde35" containerName="extract-utilities" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.615686 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="94aea104-da8a-4a93-b797-a58e84fdde35" containerName="extract-utilities" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.615912 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="94aea104-da8a-4a93-b797-a58e84fdde35" containerName="registry-server" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.621593 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.630111 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4b62h"] Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.805082 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pfhtm"] Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.805723 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-utilities\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.805881 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-catalog-content\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.805923 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rwkpr\" (UniqueName: \"kubernetes.io/projected/fbe83c1b-c285-4900-bef8-32038634713e-kube-api-access-rwkpr\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.806498 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.815305 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pfhtm"] Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.907030 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-catalog-content\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.907092 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-catalog-content\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.907131 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwkpr\" (UniqueName: \"kubernetes.io/projected/fbe83c1b-c285-4900-bef8-32038634713e-kube-api-access-rwkpr\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.907169 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-utilities\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.907298 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29xbm\" (UniqueName: \"kubernetes.io/projected/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-kube-api-access-29xbm\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.907330 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-utilities\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.907861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-utilities\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.907882 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-catalog-content\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.927850 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwkpr\" (UniqueName: 
\"kubernetes.io/projected/fbe83c1b-c285-4900-bef8-32038634713e-kube-api-access-rwkpr\") pod \"redhat-marketplace-4b62h\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:30 crc kubenswrapper[4766]: I1209 04:22:30.986910 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:31 crc kubenswrapper[4766]: I1209 04:22:31.008288 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-catalog-content\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:31 crc kubenswrapper[4766]: I1209 04:22:31.008470 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29xbm\" (UniqueName: \"kubernetes.io/projected/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-kube-api-access-29xbm\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:31 crc kubenswrapper[4766]: I1209 04:22:31.008509 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-utilities\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:31 crc kubenswrapper[4766]: I1209 04:22:31.008808 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-catalog-content\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:31 crc 
kubenswrapper[4766]: I1209 04:22:31.009123 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-utilities\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:31 crc kubenswrapper[4766]: I1209 04:22:31.040024 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29xbm\" (UniqueName: \"kubernetes.io/projected/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-kube-api-access-29xbm\") pod \"redhat-operators-pfhtm\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:31 crc kubenswrapper[4766]: I1209 04:22:31.122132 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:31 crc kubenswrapper[4766]: I1209 04:22:31.467723 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4b62h"] Dec 09 04:22:31 crc kubenswrapper[4766]: I1209 04:22:31.567127 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pfhtm"] Dec 09 04:22:31 crc kubenswrapper[4766]: W1209 04:22:31.568803 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda76da2ca_d2f3_426c_8ae2_e7bac5de78bd.slice/crio-ec7c7e8ee386b01fb701bf895094daa53418cde80504f4c494839aae1dd80df4 WatchSource:0}: Error finding container ec7c7e8ee386b01fb701bf895094daa53418cde80504f4c494839aae1dd80df4: Status 404 returned error can't find the container with id ec7c7e8ee386b01fb701bf895094daa53418cde80504f4c494839aae1dd80df4 Dec 09 04:22:32 crc kubenswrapper[4766]: I1209 04:22:32.206920 4766 generic.go:334] "Generic (PLEG): container finished" podID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" 
containerID="167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f" exitCode=0 Dec 09 04:22:32 crc kubenswrapper[4766]: I1209 04:22:32.206964 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfhtm" event={"ID":"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd","Type":"ContainerDied","Data":"167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f"} Dec 09 04:22:32 crc kubenswrapper[4766]: I1209 04:22:32.207001 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfhtm" event={"ID":"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd","Type":"ContainerStarted","Data":"ec7c7e8ee386b01fb701bf895094daa53418cde80504f4c494839aae1dd80df4"} Dec 09 04:22:32 crc kubenswrapper[4766]: I1209 04:22:32.208600 4766 generic.go:334] "Generic (PLEG): container finished" podID="fbe83c1b-c285-4900-bef8-32038634713e" containerID="68474e04a5d18440d62b2ca97fba3cc99c473e6a4b3573c28329cd051f77d731" exitCode=0 Dec 09 04:22:32 crc kubenswrapper[4766]: I1209 04:22:32.208622 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4b62h" event={"ID":"fbe83c1b-c285-4900-bef8-32038634713e","Type":"ContainerDied","Data":"68474e04a5d18440d62b2ca97fba3cc99c473e6a4b3573c28329cd051f77d731"} Dec 09 04:22:32 crc kubenswrapper[4766]: I1209 04:22:32.208647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4b62h" event={"ID":"fbe83c1b-c285-4900-bef8-32038634713e","Type":"ContainerStarted","Data":"1e7c4539603e96b91736a4e8c80387e47bf4b85fe1dfe7eaada03ce517314981"} Dec 09 04:22:33 crc kubenswrapper[4766]: I1209 04:22:33.215695 4766 generic.go:334] "Generic (PLEG): container finished" podID="fbe83c1b-c285-4900-bef8-32038634713e" containerID="fc40ba16743d3c20789eed9c73e9d27dfff4f8df4f52eee8fbe3692079e13f00" exitCode=0 Dec 09 04:22:33 crc kubenswrapper[4766]: I1209 04:22:33.215774 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-4b62h" event={"ID":"fbe83c1b-c285-4900-bef8-32038634713e","Type":"ContainerDied","Data":"fc40ba16743d3c20789eed9c73e9d27dfff4f8df4f52eee8fbe3692079e13f00"} Dec 09 04:22:33 crc kubenswrapper[4766]: I1209 04:22:33.218657 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfhtm" event={"ID":"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd","Type":"ContainerStarted","Data":"1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86"} Dec 09 04:22:34 crc kubenswrapper[4766]: I1209 04:22:34.228449 4766 generic.go:334] "Generic (PLEG): container finished" podID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerID="1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86" exitCode=0 Dec 09 04:22:34 crc kubenswrapper[4766]: I1209 04:22:34.228554 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfhtm" event={"ID":"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd","Type":"ContainerDied","Data":"1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86"} Dec 09 04:22:34 crc kubenswrapper[4766]: I1209 04:22:34.232309 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4b62h" event={"ID":"fbe83c1b-c285-4900-bef8-32038634713e","Type":"ContainerStarted","Data":"6e182c97fcdeda8e604f96d9e09348ce8a5a1de0df35a56e3163bf8bd94460a4"} Dec 09 04:22:34 crc kubenswrapper[4766]: I1209 04:22:34.278830 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4b62h" podStartSLOduration=2.657464279 podStartE2EDuration="4.278811942s" podCreationTimestamp="2025-12-09 04:22:30 +0000 UTC" firstStartedPulling="2025-12-09 04:22:32.211555848 +0000 UTC m=+4233.920861274" lastFinishedPulling="2025-12-09 04:22:33.832903511 +0000 UTC m=+4235.542208937" observedRunningTime="2025-12-09 04:22:34.273880558 +0000 UTC m=+4235.983185994" 
watchObservedRunningTime="2025-12-09 04:22:34.278811942 +0000 UTC m=+4235.988117368" Dec 09 04:22:35 crc kubenswrapper[4766]: I1209 04:22:35.242927 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfhtm" event={"ID":"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd","Type":"ContainerStarted","Data":"b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f"} Dec 09 04:22:35 crc kubenswrapper[4766]: I1209 04:22:35.270413 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pfhtm" podStartSLOduration=2.82444427 podStartE2EDuration="5.270393165s" podCreationTimestamp="2025-12-09 04:22:30 +0000 UTC" firstStartedPulling="2025-12-09 04:22:32.21013585 +0000 UTC m=+4233.919441286" lastFinishedPulling="2025-12-09 04:22:34.656084755 +0000 UTC m=+4236.365390181" observedRunningTime="2025-12-09 04:22:35.264258499 +0000 UTC m=+4236.973563935" watchObservedRunningTime="2025-12-09 04:22:35.270393165 +0000 UTC m=+4236.979698601" Dec 09 04:22:37 crc kubenswrapper[4766]: I1209 04:22:37.317042 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:22:37 crc kubenswrapper[4766]: I1209 04:22:37.317394 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:22:40 crc kubenswrapper[4766]: I1209 04:22:40.988465 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:40 crc 
kubenswrapper[4766]: I1209 04:22:40.988552 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:41 crc kubenswrapper[4766]: I1209 04:22:41.038281 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:41 crc kubenswrapper[4766]: I1209 04:22:41.123002 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:41 crc kubenswrapper[4766]: I1209 04:22:41.123332 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:41 crc kubenswrapper[4766]: I1209 04:22:41.340924 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:41 crc kubenswrapper[4766]: I1209 04:22:41.385183 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4b62h"] Dec 09 04:22:42 crc kubenswrapper[4766]: I1209 04:22:42.159611 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pfhtm" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="registry-server" probeResult="failure" output=< Dec 09 04:22:42 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 04:22:42 crc kubenswrapper[4766]: > Dec 09 04:22:43 crc kubenswrapper[4766]: I1209 04:22:43.303580 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4b62h" podUID="fbe83c1b-c285-4900-bef8-32038634713e" containerName="registry-server" containerID="cri-o://6e182c97fcdeda8e604f96d9e09348ce8a5a1de0df35a56e3163bf8bd94460a4" gracePeriod=2 Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.323847 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="fbe83c1b-c285-4900-bef8-32038634713e" containerID="6e182c97fcdeda8e604f96d9e09348ce8a5a1de0df35a56e3163bf8bd94460a4" exitCode=0 Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.323924 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4b62h" event={"ID":"fbe83c1b-c285-4900-bef8-32038634713e","Type":"ContainerDied","Data":"6e182c97fcdeda8e604f96d9e09348ce8a5a1de0df35a56e3163bf8bd94460a4"} Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.523452 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.620905 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-utilities\") pod \"fbe83c1b-c285-4900-bef8-32038634713e\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.620949 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwkpr\" (UniqueName: \"kubernetes.io/projected/fbe83c1b-c285-4900-bef8-32038634713e-kube-api-access-rwkpr\") pod \"fbe83c1b-c285-4900-bef8-32038634713e\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.621039 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-catalog-content\") pod \"fbe83c1b-c285-4900-bef8-32038634713e\" (UID: \"fbe83c1b-c285-4900-bef8-32038634713e\") " Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.622948 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-utilities" (OuterVolumeSpecName: "utilities") pod 
"fbe83c1b-c285-4900-bef8-32038634713e" (UID: "fbe83c1b-c285-4900-bef8-32038634713e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.634503 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe83c1b-c285-4900-bef8-32038634713e-kube-api-access-rwkpr" (OuterVolumeSpecName: "kube-api-access-rwkpr") pod "fbe83c1b-c285-4900-bef8-32038634713e" (UID: "fbe83c1b-c285-4900-bef8-32038634713e"). InnerVolumeSpecName "kube-api-access-rwkpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.654465 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbe83c1b-c285-4900-bef8-32038634713e" (UID: "fbe83c1b-c285-4900-bef8-32038634713e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.722891 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.722927 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwkpr\" (UniqueName: \"kubernetes.io/projected/fbe83c1b-c285-4900-bef8-32038634713e-kube-api-access-rwkpr\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:45 crc kubenswrapper[4766]: I1209 04:22:45.722945 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbe83c1b-c285-4900-bef8-32038634713e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:46 crc kubenswrapper[4766]: I1209 04:22:46.335185 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4b62h" event={"ID":"fbe83c1b-c285-4900-bef8-32038634713e","Type":"ContainerDied","Data":"1e7c4539603e96b91736a4e8c80387e47bf4b85fe1dfe7eaada03ce517314981"} Dec 09 04:22:46 crc kubenswrapper[4766]: I1209 04:22:46.335298 4766 scope.go:117] "RemoveContainer" containerID="6e182c97fcdeda8e604f96d9e09348ce8a5a1de0df35a56e3163bf8bd94460a4" Dec 09 04:22:46 crc kubenswrapper[4766]: I1209 04:22:46.336142 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4b62h" Dec 09 04:22:46 crc kubenswrapper[4766]: I1209 04:22:46.369937 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4b62h"] Dec 09 04:22:46 crc kubenswrapper[4766]: I1209 04:22:46.375318 4766 scope.go:117] "RemoveContainer" containerID="fc40ba16743d3c20789eed9c73e9d27dfff4f8df4f52eee8fbe3692079e13f00" Dec 09 04:22:46 crc kubenswrapper[4766]: I1209 04:22:46.376270 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4b62h"] Dec 09 04:22:46 crc kubenswrapper[4766]: I1209 04:22:46.463366 4766 scope.go:117] "RemoveContainer" containerID="68474e04a5d18440d62b2ca97fba3cc99c473e6a4b3573c28329cd051f77d731" Dec 09 04:22:46 crc kubenswrapper[4766]: I1209 04:22:46.852500 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe83c1b-c285-4900-bef8-32038634713e" path="/var/lib/kubelet/pods/fbe83c1b-c285-4900-bef8-32038634713e/volumes" Dec 09 04:22:51 crc kubenswrapper[4766]: I1209 04:22:51.170476 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:51 crc kubenswrapper[4766]: I1209 04:22:51.225891 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:54 crc kubenswrapper[4766]: I1209 04:22:54.776448 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pfhtm"] Dec 09 04:22:54 crc kubenswrapper[4766]: I1209 04:22:54.777346 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pfhtm" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="registry-server" containerID="cri-o://b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f" gracePeriod=2 Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 
04:22:55.190911 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.361245 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29xbm\" (UniqueName: \"kubernetes.io/projected/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-kube-api-access-29xbm\") pod \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.361300 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-utilities\") pod \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.361368 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-catalog-content\") pod \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\" (UID: \"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd\") " Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.362272 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-utilities" (OuterVolumeSpecName: "utilities") pod "a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" (UID: "a76da2ca-d2f3-426c-8ae2-e7bac5de78bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.369014 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-kube-api-access-29xbm" (OuterVolumeSpecName: "kube-api-access-29xbm") pod "a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" (UID: "a76da2ca-d2f3-426c-8ae2-e7bac5de78bd"). InnerVolumeSpecName "kube-api-access-29xbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.411435 4766 generic.go:334] "Generic (PLEG): container finished" podID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerID="b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f" exitCode=0 Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.411503 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfhtm" event={"ID":"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd","Type":"ContainerDied","Data":"b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f"} Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.411547 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfhtm" event={"ID":"a76da2ca-d2f3-426c-8ae2-e7bac5de78bd","Type":"ContainerDied","Data":"ec7c7e8ee386b01fb701bf895094daa53418cde80504f4c494839aae1dd80df4"} Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.411576 4766 scope.go:117] "RemoveContainer" containerID="b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.411748 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pfhtm" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.434802 4766 scope.go:117] "RemoveContainer" containerID="1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.454805 4766 scope.go:117] "RemoveContainer" containerID="167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.463069 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29xbm\" (UniqueName: \"kubernetes.io/projected/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-kube-api-access-29xbm\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.463106 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.478293 4766 scope.go:117] "RemoveContainer" containerID="b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f" Dec 09 04:22:55 crc kubenswrapper[4766]: E1209 04:22:55.478863 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f\": container with ID starting with b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f not found: ID does not exist" containerID="b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.478897 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f"} err="failed to get container status \"b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f\": rpc error: code = NotFound desc = could not 
find container \"b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f\": container with ID starting with b7176cd88724ca14a08c273834b1baf3ec52720e143ffb62e36de9ed36bae98f not found: ID does not exist" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.478924 4766 scope.go:117] "RemoveContainer" containerID="1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86" Dec 09 04:22:55 crc kubenswrapper[4766]: E1209 04:22:55.479227 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86\": container with ID starting with 1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86 not found: ID does not exist" containerID="1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.479283 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86"} err="failed to get container status \"1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86\": rpc error: code = NotFound desc = could not find container \"1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86\": container with ID starting with 1a44b1defe190df54065c596bdf092703453b7a9a77ce74d9b647dd35b69fe86 not found: ID does not exist" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.479321 4766 scope.go:117] "RemoveContainer" containerID="167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f" Dec 09 04:22:55 crc kubenswrapper[4766]: E1209 04:22:55.479552 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f\": container with ID starting with 167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f not found: ID 
does not exist" containerID="167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.479577 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f"} err="failed to get container status \"167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f\": rpc error: code = NotFound desc = could not find container \"167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f\": container with ID starting with 167b2c43db15fa2d9edfb0e996e425a3328da01b3abd544320000f673a39a99f not found: ID does not exist" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.481912 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" (UID: "a76da2ca-d2f3-426c-8ae2-e7bac5de78bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.564960 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.748345 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pfhtm"] Dec 09 04:22:55 crc kubenswrapper[4766]: I1209 04:22:55.755483 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pfhtm"] Dec 09 04:22:56 crc kubenswrapper[4766]: I1209 04:22:56.856400 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" path="/var/lib/kubelet/pods/a76da2ca-d2f3-426c-8ae2-e7bac5de78bd/volumes" Dec 09 04:23:07 crc kubenswrapper[4766]: I1209 04:23:07.316167 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:23:07 crc kubenswrapper[4766]: I1209 04:23:07.317586 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:23:07 crc kubenswrapper[4766]: I1209 04:23:07.317724 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:23:07 crc kubenswrapper[4766]: I1209 04:23:07.319040 4766 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:23:07 crc kubenswrapper[4766]: I1209 04:23:07.319187 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" gracePeriod=600 Dec 09 04:23:07 crc kubenswrapper[4766]: E1209 04:23:07.456421 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:23:07 crc kubenswrapper[4766]: I1209 04:23:07.526097 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" exitCode=0 Dec 09 04:23:07 crc kubenswrapper[4766]: I1209 04:23:07.526143 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616"} Dec 09 04:23:07 crc kubenswrapper[4766]: I1209 04:23:07.526175 4766 scope.go:117] "RemoveContainer" containerID="c5a7e8c76f2e51d9c804a5f3e67b38e0c969ac446f32a8ed39903e76c208d54b" Dec 09 04:23:07 crc 
kubenswrapper[4766]: I1209 04:23:07.526892 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:23:07 crc kubenswrapper[4766]: E1209 04:23:07.527319 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:23:22 crc kubenswrapper[4766]: I1209 04:23:22.839731 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:23:22 crc kubenswrapper[4766]: E1209 04:23:22.840793 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:23:35 crc kubenswrapper[4766]: I1209 04:23:35.838827 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:23:35 crc kubenswrapper[4766]: E1209 04:23:35.839648 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 
09 04:23:49 crc kubenswrapper[4766]: I1209 04:23:49.840071 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:23:49 crc kubenswrapper[4766]: E1209 04:23:49.841310 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:24:03 crc kubenswrapper[4766]: I1209 04:24:03.839408 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:24:03 crc kubenswrapper[4766]: E1209 04:24:03.840382 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:24:17 crc kubenswrapper[4766]: I1209 04:24:17.839072 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:24:17 crc kubenswrapper[4766]: E1209 04:24:17.839792 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:24:29 crc kubenswrapper[4766]: I1209 04:24:29.839326 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:24:29 crc kubenswrapper[4766]: E1209 04:24:29.840014 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:24:44 crc kubenswrapper[4766]: I1209 04:24:44.845299 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:24:44 crc kubenswrapper[4766]: E1209 04:24:44.851290 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:24:55 crc kubenswrapper[4766]: I1209 04:24:55.839385 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:24:55 crc kubenswrapper[4766]: E1209 04:24:55.840411 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:25:08 crc kubenswrapper[4766]: I1209 04:25:08.844704 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:25:08 crc kubenswrapper[4766]: E1209 04:25:08.845468 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:25:20 crc kubenswrapper[4766]: I1209 04:25:20.839374 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:25:20 crc kubenswrapper[4766]: E1209 04:25:20.840546 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:25:35 crc kubenswrapper[4766]: I1209 04:25:35.839428 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:25:35 crc kubenswrapper[4766]: E1209 04:25:35.840793 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:25:47 crc kubenswrapper[4766]: I1209 04:25:47.839477 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:25:47 crc kubenswrapper[4766]: E1209 04:25:47.841880 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:26:01 crc kubenswrapper[4766]: I1209 04:26:01.839546 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:26:01 crc kubenswrapper[4766]: E1209 04:26:01.840369 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:26:15 crc kubenswrapper[4766]: I1209 04:26:15.839499 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:26:15 crc kubenswrapper[4766]: E1209 04:26:15.840285 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:26:26 crc kubenswrapper[4766]: I1209 04:26:26.839395 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:26:26 crc kubenswrapper[4766]: E1209 04:26:26.840252 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:26:40 crc kubenswrapper[4766]: I1209 04:26:40.839649 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:26:40 crc kubenswrapper[4766]: E1209 04:26:40.840919 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:26:52 crc kubenswrapper[4766]: I1209 04:26:52.839861 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:26:52 crc kubenswrapper[4766]: E1209 04:26:52.841082 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:27:05 crc kubenswrapper[4766]: I1209 04:27:05.839836 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:27:05 crc kubenswrapper[4766]: E1209 04:27:05.841005 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:27:19 crc kubenswrapper[4766]: I1209 04:27:19.839302 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:27:19 crc kubenswrapper[4766]: E1209 04:27:19.840124 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:27:30 crc kubenswrapper[4766]: I1209 04:27:30.839748 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:27:30 crc kubenswrapper[4766]: E1209 04:27:30.840895 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:27:42 crc kubenswrapper[4766]: I1209 04:27:42.839432 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:27:42 crc kubenswrapper[4766]: E1209 04:27:42.840318 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:27:57 crc kubenswrapper[4766]: I1209 04:27:57.839720 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:27:57 crc kubenswrapper[4766]: E1209 04:27:57.840745 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:28:11 crc kubenswrapper[4766]: I1209 04:28:11.840039 4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:28:12 crc kubenswrapper[4766]: I1209 04:28:12.244199 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"af95de5f173efefe23a3858a8ceadb3e71c7f893eebc5787426af7094bd6a972"} Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.197561 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp"] Dec 09 04:30:00 crc kubenswrapper[4766]: E1209 04:30:00.198521 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe83c1b-c285-4900-bef8-32038634713e" containerName="extract-content" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.198545 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe83c1b-c285-4900-bef8-32038634713e" containerName="extract-content" Dec 09 04:30:00 crc kubenswrapper[4766]: E1209 04:30:00.198572 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe83c1b-c285-4900-bef8-32038634713e" containerName="registry-server" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.198580 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe83c1b-c285-4900-bef8-32038634713e" containerName="registry-server" Dec 09 04:30:00 crc kubenswrapper[4766]: E1209 04:30:00.198594 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe83c1b-c285-4900-bef8-32038634713e" containerName="extract-utilities" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.198602 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe83c1b-c285-4900-bef8-32038634713e" containerName="extract-utilities" Dec 09 04:30:00 crc kubenswrapper[4766]: E1209 04:30:00.198620 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="extract-utilities" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.198627 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="extract-utilities" Dec 09 04:30:00 crc kubenswrapper[4766]: E1209 
04:30:00.198638 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="registry-server" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.198645 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="registry-server" Dec 09 04:30:00 crc kubenswrapper[4766]: E1209 04:30:00.198659 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="extract-content" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.198670 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="extract-content" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.198860 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76da2ca-d2f3-426c-8ae2-e7bac5de78bd" containerName="registry-server" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.198884 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe83c1b-c285-4900-bef8-32038634713e" containerName="registry-server" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.199813 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.202667 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-secret-volume\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.202689 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.202714 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zczxc\" (UniqueName: \"kubernetes.io/projected/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-kube-api-access-zczxc\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.202761 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-config-volume\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.204667 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.210603 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp"] Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.304421 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-secret-volume\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.304527 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zczxc\" (UniqueName: \"kubernetes.io/projected/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-kube-api-access-zczxc\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.304603 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-config-volume\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.305752 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-config-volume\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.314631 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-secret-volume\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.321966 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zczxc\" (UniqueName: \"kubernetes.io/projected/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-kube-api-access-zczxc\") pod \"collect-profiles-29420910-qn6pp\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.519558 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:00 crc kubenswrapper[4766]: I1209 04:30:00.962099 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp"] Dec 09 04:30:01 crc kubenswrapper[4766]: I1209 04:30:01.692636 4766 generic.go:334] "Generic (PLEG): container finished" podID="7d6bf2ea-ab41-4d7c-915c-173cccd5043a" containerID="2081b0b7b3b77dfd02bcb3b144ddc1644d494418c760cd67ab818e150e85083b" exitCode=0 Dec 09 04:30:01 crc kubenswrapper[4766]: I1209 04:30:01.692721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" event={"ID":"7d6bf2ea-ab41-4d7c-915c-173cccd5043a","Type":"ContainerDied","Data":"2081b0b7b3b77dfd02bcb3b144ddc1644d494418c760cd67ab818e150e85083b"} Dec 09 04:30:01 crc kubenswrapper[4766]: I1209 04:30:01.692970 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" 
event={"ID":"7d6bf2ea-ab41-4d7c-915c-173cccd5043a","Type":"ContainerStarted","Data":"8ab94edfbd40faead4293ab94253782b7b7484ed5a71dcadb779812eef19782f"} Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.060757 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.245678 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zczxc\" (UniqueName: \"kubernetes.io/projected/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-kube-api-access-zczxc\") pod \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.245766 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-config-volume\") pod \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.245927 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-secret-volume\") pod \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\" (UID: \"7d6bf2ea-ab41-4d7c-915c-173cccd5043a\") " Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.247129 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d6bf2ea-ab41-4d7c-915c-173cccd5043a" (UID: "7d6bf2ea-ab41-4d7c-915c-173cccd5043a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.254401 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-kube-api-access-zczxc" (OuterVolumeSpecName: "kube-api-access-zczxc") pod "7d6bf2ea-ab41-4d7c-915c-173cccd5043a" (UID: "7d6bf2ea-ab41-4d7c-915c-173cccd5043a"). InnerVolumeSpecName "kube-api-access-zczxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.347620 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.347657 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zczxc\" (UniqueName: \"kubernetes.io/projected/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-kube-api-access-zczxc\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.555439 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d6bf2ea-ab41-4d7c-915c-173cccd5043a" (UID: "7d6bf2ea-ab41-4d7c-915c-173cccd5043a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.651560 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d6bf2ea-ab41-4d7c-915c-173cccd5043a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.711294 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" event={"ID":"7d6bf2ea-ab41-4d7c-915c-173cccd5043a","Type":"ContainerDied","Data":"8ab94edfbd40faead4293ab94253782b7b7484ed5a71dcadb779812eef19782f"} Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.711335 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ab94edfbd40faead4293ab94253782b7b7484ed5a71dcadb779812eef19782f" Dec 09 04:30:03 crc kubenswrapper[4766]: I1209 04:30:03.711383 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp" Dec 09 04:30:04 crc kubenswrapper[4766]: I1209 04:30:04.142643 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v"] Dec 09 04:30:04 crc kubenswrapper[4766]: I1209 04:30:04.149498 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420865-7959v"] Dec 09 04:30:04 crc kubenswrapper[4766]: I1209 04:30:04.853915 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="423a5f21-244b-4266-8d32-29468a4bf6f3" path="/var/lib/kubelet/pods/423a5f21-244b-4266-8d32-29468a4bf6f3/volumes" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.322813 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-2ms6v"] Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.331886 4766 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["crc-storage/crc-storage-crc-2ms6v"] Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.469993 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-249jk"] Dec 09 04:30:09 crc kubenswrapper[4766]: E1209 04:30:09.470601 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d6bf2ea-ab41-4d7c-915c-173cccd5043a" containerName="collect-profiles" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.470691 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d6bf2ea-ab41-4d7c-915c-173cccd5043a" containerName="collect-profiles" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.470941 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d6bf2ea-ab41-4d7c-915c-173cccd5043a" containerName="collect-profiles" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.471542 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.474039 4766 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xhpdj" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.474275 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.474402 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.474534 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.489809 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-249jk"] Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.636248 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cbfde79a-0681-42d5-bcee-38234466def9-crc-storage\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.636345 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t25tw\" (UniqueName: \"kubernetes.io/projected/cbfde79a-0681-42d5-bcee-38234466def9-kube-api-access-t25tw\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.636517 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cbfde79a-0681-42d5-bcee-38234466def9-node-mnt\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.738237 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t25tw\" (UniqueName: \"kubernetes.io/projected/cbfde79a-0681-42d5-bcee-38234466def9-kube-api-access-t25tw\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.738309 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cbfde79a-0681-42d5-bcee-38234466def9-node-mnt\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.738344 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/cbfde79a-0681-42d5-bcee-38234466def9-crc-storage\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.738671 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cbfde79a-0681-42d5-bcee-38234466def9-node-mnt\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.739068 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cbfde79a-0681-42d5-bcee-38234466def9-crc-storage\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.763956 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t25tw\" (UniqueName: \"kubernetes.io/projected/cbfde79a-0681-42d5-bcee-38234466def9-kube-api-access-t25tw\") pod \"crc-storage-crc-249jk\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:09 crc kubenswrapper[4766]: I1209 04:30:09.791584 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:10 crc kubenswrapper[4766]: I1209 04:30:10.283460 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-249jk"] Dec 09 04:30:10 crc kubenswrapper[4766]: I1209 04:30:10.291541 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 04:30:10 crc kubenswrapper[4766]: I1209 04:30:10.777607 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-249jk" event={"ID":"cbfde79a-0681-42d5-bcee-38234466def9","Type":"ContainerStarted","Data":"29d994db1736652ba7716e77d6ef8e5bdc3a9d96b19573238ab7ff4a9194b7e7"} Dec 09 04:30:10 crc kubenswrapper[4766]: I1209 04:30:10.849522 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a7ab16b-3406-45b8-87bb-94799ff01d2e" path="/var/lib/kubelet/pods/7a7ab16b-3406-45b8-87bb-94799ff01d2e/volumes" Dec 09 04:30:11 crc kubenswrapper[4766]: I1209 04:30:11.804328 4766 generic.go:334] "Generic (PLEG): container finished" podID="cbfde79a-0681-42d5-bcee-38234466def9" containerID="1f4207a37955764890b3f2af640a5090f8ea902f8d4667ed6c87aa2742155685" exitCode=0 Dec 09 04:30:11 crc kubenswrapper[4766]: I1209 04:30:11.804700 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-249jk" event={"ID":"cbfde79a-0681-42d5-bcee-38234466def9","Type":"ContainerDied","Data":"1f4207a37955764890b3f2af640a5090f8ea902f8d4667ed6c87aa2742155685"} Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.130721 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.204383 4766 scope.go:117] "RemoveContainer" containerID="0000d767f3cf311aa03fa518b2584004cedcba666a2aaf2b35eb2ba28afeabe6" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.229116 4766 scope.go:117] "RemoveContainer" containerID="bdceedd8d8ebb9eb3cafcf36d93bd17206914478df573062d74cfd4a92b18e29" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.234341 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t25tw\" (UniqueName: \"kubernetes.io/projected/cbfde79a-0681-42d5-bcee-38234466def9-kube-api-access-t25tw\") pod \"cbfde79a-0681-42d5-bcee-38234466def9\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.234388 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cbfde79a-0681-42d5-bcee-38234466def9-node-mnt\") pod \"cbfde79a-0681-42d5-bcee-38234466def9\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.234445 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cbfde79a-0681-42d5-bcee-38234466def9-crc-storage\") pod \"cbfde79a-0681-42d5-bcee-38234466def9\" (UID: \"cbfde79a-0681-42d5-bcee-38234466def9\") " Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.234555 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbfde79a-0681-42d5-bcee-38234466def9-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "cbfde79a-0681-42d5-bcee-38234466def9" (UID: "cbfde79a-0681-42d5-bcee-38234466def9"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.234747 4766 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cbfde79a-0681-42d5-bcee-38234466def9-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.242241 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbfde79a-0681-42d5-bcee-38234466def9-kube-api-access-t25tw" (OuterVolumeSpecName: "kube-api-access-t25tw") pod "cbfde79a-0681-42d5-bcee-38234466def9" (UID: "cbfde79a-0681-42d5-bcee-38234466def9"). InnerVolumeSpecName "kube-api-access-t25tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.267971 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbfde79a-0681-42d5-bcee-38234466def9-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "cbfde79a-0681-42d5-bcee-38234466def9" (UID: "cbfde79a-0681-42d5-bcee-38234466def9"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.335438 4766 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cbfde79a-0681-42d5-bcee-38234466def9-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.335466 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t25tw\" (UniqueName: \"kubernetes.io/projected/cbfde79a-0681-42d5-bcee-38234466def9-kube-api-access-t25tw\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.821620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-249jk" event={"ID":"cbfde79a-0681-42d5-bcee-38234466def9","Type":"ContainerDied","Data":"29d994db1736652ba7716e77d6ef8e5bdc3a9d96b19573238ab7ff4a9194b7e7"} Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.821656 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d994db1736652ba7716e77d6ef8e5bdc3a9d96b19573238ab7ff4a9194b7e7" Dec 09 04:30:13 crc kubenswrapper[4766]: I1209 04:30:13.821670 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-249jk" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.665944 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-249jk"] Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.675717 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-249jk"] Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.842742 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-p2c7f"] Dec 09 04:30:15 crc kubenswrapper[4766]: E1209 04:30:15.843102 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfde79a-0681-42d5-bcee-38234466def9" containerName="storage" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.843149 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfde79a-0681-42d5-bcee-38234466def9" containerName="storage" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.843401 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfde79a-0681-42d5-bcee-38234466def9" containerName="storage" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.843935 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.845985 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.846262 4766 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-xhpdj" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.846343 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.847767 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.850557 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p2c7f"] Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.972778 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9504352-e557-43d2-81cc-2da34fb69872-crc-storage\") pod \"crc-storage-crc-p2c7f\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.972863 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2w5n\" (UniqueName: \"kubernetes.io/projected/b9504352-e557-43d2-81cc-2da34fb69872-kube-api-access-f2w5n\") pod \"crc-storage-crc-p2c7f\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:15 crc kubenswrapper[4766]: I1209 04:30:15.972901 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9504352-e557-43d2-81cc-2da34fb69872-node-mnt\") pod \"crc-storage-crc-p2c7f\" (UID: 
\"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.073747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9504352-e557-43d2-81cc-2da34fb69872-node-mnt\") pod \"crc-storage-crc-p2c7f\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.073953 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9504352-e557-43d2-81cc-2da34fb69872-crc-storage\") pod \"crc-storage-crc-p2c7f\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.074105 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2w5n\" (UniqueName: \"kubernetes.io/projected/b9504352-e557-43d2-81cc-2da34fb69872-kube-api-access-f2w5n\") pod \"crc-storage-crc-p2c7f\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.075513 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9504352-e557-43d2-81cc-2da34fb69872-crc-storage\") pod \"crc-storage-crc-p2c7f\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.075700 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9504352-e557-43d2-81cc-2da34fb69872-node-mnt\") pod \"crc-storage-crc-p2c7f\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.108398 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2w5n\" (UniqueName: \"kubernetes.io/projected/b9504352-e557-43d2-81cc-2da34fb69872-kube-api-access-f2w5n\") pod \"crc-storage-crc-p2c7f\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.160986 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.662349 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p2c7f"] Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.856834 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbfde79a-0681-42d5-bcee-38234466def9" path="/var/lib/kubelet/pods/cbfde79a-0681-42d5-bcee-38234466def9/volumes" Dec 09 04:30:16 crc kubenswrapper[4766]: I1209 04:30:16.857998 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2c7f" event={"ID":"b9504352-e557-43d2-81cc-2da34fb69872","Type":"ContainerStarted","Data":"e409a187110edbb0bded7046a780d86753a90187c84fa5e535b810c4b2f898f7"} Dec 09 04:30:17 crc kubenswrapper[4766]: I1209 04:30:17.863097 4766 generic.go:334] "Generic (PLEG): container finished" podID="b9504352-e557-43d2-81cc-2da34fb69872" containerID="6740359c0cbf6385aa0534ee6b6c39e3bc53b15f947d1ced6d019c38083f1d10" exitCode=0 Dec 09 04:30:17 crc kubenswrapper[4766]: I1209 04:30:17.863208 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2c7f" event={"ID":"b9504352-e557-43d2-81cc-2da34fb69872","Type":"ContainerDied","Data":"6740359c0cbf6385aa0534ee6b6c39e3bc53b15f947d1ced6d019c38083f1d10"} Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.237782 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.320873 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2w5n\" (UniqueName: \"kubernetes.io/projected/b9504352-e557-43d2-81cc-2da34fb69872-kube-api-access-f2w5n\") pod \"b9504352-e557-43d2-81cc-2da34fb69872\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.321013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9504352-e557-43d2-81cc-2da34fb69872-crc-storage\") pod \"b9504352-e557-43d2-81cc-2da34fb69872\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.321047 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9504352-e557-43d2-81cc-2da34fb69872-node-mnt\") pod \"b9504352-e557-43d2-81cc-2da34fb69872\" (UID: \"b9504352-e557-43d2-81cc-2da34fb69872\") " Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.321338 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9504352-e557-43d2-81cc-2da34fb69872-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b9504352-e557-43d2-81cc-2da34fb69872" (UID: "b9504352-e557-43d2-81cc-2da34fb69872"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.325342 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9504352-e557-43d2-81cc-2da34fb69872-kube-api-access-f2w5n" (OuterVolumeSpecName: "kube-api-access-f2w5n") pod "b9504352-e557-43d2-81cc-2da34fb69872" (UID: "b9504352-e557-43d2-81cc-2da34fb69872"). InnerVolumeSpecName "kube-api-access-f2w5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.340102 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9504352-e557-43d2-81cc-2da34fb69872-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b9504352-e557-43d2-81cc-2da34fb69872" (UID: "b9504352-e557-43d2-81cc-2da34fb69872"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.423327 4766 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9504352-e557-43d2-81cc-2da34fb69872-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.423394 4766 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9504352-e557-43d2-81cc-2da34fb69872-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.423411 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2w5n\" (UniqueName: \"kubernetes.io/projected/b9504352-e557-43d2-81cc-2da34fb69872-kube-api-access-f2w5n\") on node \"crc\" DevicePath \"\"" Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.886184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2c7f" event={"ID":"b9504352-e557-43d2-81cc-2da34fb69872","Type":"ContainerDied","Data":"e409a187110edbb0bded7046a780d86753a90187c84fa5e535b810c4b2f898f7"} Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.886250 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e409a187110edbb0bded7046a780d86753a90187c84fa5e535b810c4b2f898f7" Dec 09 04:30:19 crc kubenswrapper[4766]: I1209 04:30:19.886344 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2c7f" Dec 09 04:30:37 crc kubenswrapper[4766]: I1209 04:30:37.316183 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:30:37 crc kubenswrapper[4766]: I1209 04:30:37.317011 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:31:07 crc kubenswrapper[4766]: I1209 04:31:07.317287 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:31:07 crc kubenswrapper[4766]: I1209 04:31:07.318043 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:31:37 crc kubenswrapper[4766]: I1209 04:31:37.316801 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:31:37 crc kubenswrapper[4766]: I1209 04:31:37.317711 4766 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:31:37 crc kubenswrapper[4766]: I1209 04:31:37.317801 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:31:37 crc kubenswrapper[4766]: I1209 04:31:37.319054 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af95de5f173efefe23a3858a8ceadb3e71c7f893eebc5787426af7094bd6a972"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:31:37 crc kubenswrapper[4766]: I1209 04:31:37.319195 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://af95de5f173efefe23a3858a8ceadb3e71c7f893eebc5787426af7094bd6a972" gracePeriod=600 Dec 09 04:31:37 crc kubenswrapper[4766]: I1209 04:31:37.548619 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="af95de5f173efefe23a3858a8ceadb3e71c7f893eebc5787426af7094bd6a972" exitCode=0 Dec 09 04:31:37 crc kubenswrapper[4766]: I1209 04:31:37.548673 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"af95de5f173efefe23a3858a8ceadb3e71c7f893eebc5787426af7094bd6a972"} Dec 09 04:31:37 crc kubenswrapper[4766]: I1209 04:31:37.548719 
4766 scope.go:117] "RemoveContainer" containerID="66364b9bdf6784acb69ce7c7b3742ba76502977f6c43d77878a21b8ac666f616" Dec 09 04:31:38 crc kubenswrapper[4766]: I1209 04:31:38.561390 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9"} Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.121675 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-29nsh"] Dec 09 04:32:58 crc kubenswrapper[4766]: E1209 04:32:58.128190 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9504352-e557-43d2-81cc-2da34fb69872" containerName="storage" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.128342 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9504352-e557-43d2-81cc-2da34fb69872" containerName="storage" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.128627 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9504352-e557-43d2-81cc-2da34fb69872" containerName="storage" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.129637 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.139659 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-29nsh"] Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.169472 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-catalog-content\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.169607 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7lxn\" (UniqueName: \"kubernetes.io/projected/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-kube-api-access-r7lxn\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.169713 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-utilities\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.271083 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-catalog-content\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.271142 4766 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-r7lxn\" (UniqueName: \"kubernetes.io/projected/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-kube-api-access-r7lxn\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.271180 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-utilities\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.271749 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-utilities\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.271891 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-catalog-content\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.296514 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7lxn\" (UniqueName: \"kubernetes.io/projected/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-kube-api-access-r7lxn\") pod \"redhat-operators-29nsh\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.511466 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:32:58 crc kubenswrapper[4766]: I1209 04:32:58.954991 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-29nsh"] Dec 09 04:32:59 crc kubenswrapper[4766]: I1209 04:32:59.257519 4766 generic.go:334] "Generic (PLEG): container finished" podID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerID="ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6" exitCode=0 Dec 09 04:32:59 crc kubenswrapper[4766]: I1209 04:32:59.257592 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-29nsh" event={"ID":"b6d37fb4-537d-4edd-abf6-b55ea9455ff6","Type":"ContainerDied","Data":"ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6"} Dec 09 04:32:59 crc kubenswrapper[4766]: I1209 04:32:59.257660 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-29nsh" event={"ID":"b6d37fb4-537d-4edd-abf6-b55ea9455ff6","Type":"ContainerStarted","Data":"cab4facd2ec0539875e0e1f04171fb98a896b4e45bbac48026df78bc176e2460"} Dec 09 04:33:00 crc kubenswrapper[4766]: I1209 04:33:00.268683 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-29nsh" event={"ID":"b6d37fb4-537d-4edd-abf6-b55ea9455ff6","Type":"ContainerStarted","Data":"3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525"} Dec 09 04:33:01 crc kubenswrapper[4766]: I1209 04:33:01.286478 4766 generic.go:334] "Generic (PLEG): container finished" podID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerID="3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525" exitCode=0 Dec 09 04:33:01 crc kubenswrapper[4766]: I1209 04:33:01.286542 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-29nsh" 
event={"ID":"b6d37fb4-537d-4edd-abf6-b55ea9455ff6","Type":"ContainerDied","Data":"3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525"} Dec 09 04:33:02 crc kubenswrapper[4766]: I1209 04:33:02.297382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-29nsh" event={"ID":"b6d37fb4-537d-4edd-abf6-b55ea9455ff6","Type":"ContainerStarted","Data":"d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0"} Dec 09 04:33:02 crc kubenswrapper[4766]: I1209 04:33:02.320023 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-29nsh" podStartSLOduration=1.7776677250000001 podStartE2EDuration="4.32000328s" podCreationTimestamp="2025-12-09 04:32:58 +0000 UTC" firstStartedPulling="2025-12-09 04:32:59.25883426 +0000 UTC m=+4860.968139686" lastFinishedPulling="2025-12-09 04:33:01.801169775 +0000 UTC m=+4863.510475241" observedRunningTime="2025-12-09 04:33:02.310732149 +0000 UTC m=+4864.020037585" watchObservedRunningTime="2025-12-09 04:33:02.32000328 +0000 UTC m=+4864.029308716" Dec 09 04:33:08 crc kubenswrapper[4766]: I1209 04:33:08.512436 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:33:08 crc kubenswrapper[4766]: I1209 04:33:08.513100 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:33:08 crc kubenswrapper[4766]: I1209 04:33:08.568738 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:33:09 crc kubenswrapper[4766]: I1209 04:33:09.424884 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:33:09 crc kubenswrapper[4766]: I1209 04:33:09.480029 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-29nsh"] Dec 09 04:33:11 crc kubenswrapper[4766]: I1209 04:33:11.363605 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-29nsh" podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerName="registry-server" containerID="cri-o://d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0" gracePeriod=2 Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.016020 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.180452 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7lxn\" (UniqueName: \"kubernetes.io/projected/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-kube-api-access-r7lxn\") pod \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.180778 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-catalog-content\") pod \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.180933 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-utilities\") pod \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\" (UID: \"b6d37fb4-537d-4edd-abf6-b55ea9455ff6\") " Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.181687 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-utilities" (OuterVolumeSpecName: "utilities") pod "b6d37fb4-537d-4edd-abf6-b55ea9455ff6" (UID: 
"b6d37fb4-537d-4edd-abf6-b55ea9455ff6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.187873 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-kube-api-access-r7lxn" (OuterVolumeSpecName: "kube-api-access-r7lxn") pod "b6d37fb4-537d-4edd-abf6-b55ea9455ff6" (UID: "b6d37fb4-537d-4edd-abf6-b55ea9455ff6"). InnerVolumeSpecName "kube-api-access-r7lxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.283519 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7lxn\" (UniqueName: \"kubernetes.io/projected/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-kube-api-access-r7lxn\") on node \"crc\" DevicePath \"\"" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.283599 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.322771 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6d37fb4-537d-4edd-abf6-b55ea9455ff6" (UID: "b6d37fb4-537d-4edd-abf6-b55ea9455ff6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.384932 4766 generic.go:334] "Generic (PLEG): container finished" podID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerID="d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0" exitCode=0 Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.385007 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-29nsh" event={"ID":"b6d37fb4-537d-4edd-abf6-b55ea9455ff6","Type":"ContainerDied","Data":"d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0"} Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.385056 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-29nsh" event={"ID":"b6d37fb4-537d-4edd-abf6-b55ea9455ff6","Type":"ContainerDied","Data":"cab4facd2ec0539875e0e1f04171fb98a896b4e45bbac48026df78bc176e2460"} Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.385094 4766 scope.go:117] "RemoveContainer" containerID="d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.385174 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d37fb4-537d-4edd-abf6-b55ea9455ff6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.385327 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-29nsh" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.407010 4766 scope.go:117] "RemoveContainer" containerID="3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.431173 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-29nsh"] Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.435046 4766 scope.go:117] "RemoveContainer" containerID="ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.445145 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-29nsh"] Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.469450 4766 scope.go:117] "RemoveContainer" containerID="d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0" Dec 09 04:33:13 crc kubenswrapper[4766]: E1209 04:33:13.469810 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0\": container with ID starting with d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0 not found: ID does not exist" containerID="d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.469846 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0"} err="failed to get container status \"d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0\": rpc error: code = NotFound desc = could not find container \"d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0\": container with ID starting with d54b0c661b03802f14b0df77a86c562b31b49a933624d867f7f5cf147850e2d0 not found: ID does 
not exist" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.469873 4766 scope.go:117] "RemoveContainer" containerID="3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525" Dec 09 04:33:13 crc kubenswrapper[4766]: E1209 04:33:13.470151 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525\": container with ID starting with 3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525 not found: ID does not exist" containerID="3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.470181 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525"} err="failed to get container status \"3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525\": rpc error: code = NotFound desc = could not find container \"3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525\": container with ID starting with 3e9bf9c042ab64bb2b17de2a5c2522e360599f61dfd1720a7d93f233b949e525 not found: ID does not exist" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.470201 4766 scope.go:117] "RemoveContainer" containerID="ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6" Dec 09 04:33:13 crc kubenswrapper[4766]: E1209 04:33:13.470502 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6\": container with ID starting with ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6 not found: ID does not exist" containerID="ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6" Dec 09 04:33:13 crc kubenswrapper[4766]: I1209 04:33:13.470531 4766 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6"} err="failed to get container status \"ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6\": rpc error: code = NotFound desc = could not find container \"ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6\": container with ID starting with ad70de5f50c560a23617291597b8614c4123450265412971744cc34b70cf1db6 not found: ID does not exist" Dec 09 04:33:14 crc kubenswrapper[4766]: I1209 04:33:14.848825 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" path="/var/lib/kubelet/pods/b6d37fb4-537d-4edd-abf6-b55ea9455ff6/volumes" Dec 09 04:33:37 crc kubenswrapper[4766]: I1209 04:33:37.316847 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:33:37 crc kubenswrapper[4766]: I1209 04:33:37.317452 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.675048 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-9qcm6"] Dec 09 04:33:51 crc kubenswrapper[4766]: E1209 04:33:51.675976 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerName="extract-utilities" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.675994 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerName="extract-utilities" Dec 09 04:33:51 crc kubenswrapper[4766]: E1209 04:33:51.676018 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerName="extract-content" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.676025 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerName="extract-content" Dec 09 04:33:51 crc kubenswrapper[4766]: E1209 04:33:51.676045 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerName="registry-server" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.676053 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerName="registry-server" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.676235 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d37fb4-537d-4edd-abf6-b55ea9455ff6" containerName="registry-server" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.677124 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.678688 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.678892 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lpzqm" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.679294 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.679514 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.679540 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.698964 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-9qcm6"] Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.748344 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-config\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.748389 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.748574 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ldvhc\" (UniqueName: \"kubernetes.io/projected/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-kube-api-access-ldvhc\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.849869 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvhc\" (UniqueName: \"kubernetes.io/projected/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-kube-api-access-ldvhc\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.849947 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-config\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.849979 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.852385 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-config\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.852421 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.878468 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvhc\" (UniqueName: \"kubernetes.io/projected/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-kube-api-access-ldvhc\") pod \"dnsmasq-dns-5d7b5456f5-9qcm6\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.954514 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-kfgb8"] Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.955821 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:51 crc kubenswrapper[4766]: I1209 04:33:51.967458 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-kfgb8"] Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:51.996000 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.054801 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.054890 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-config\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.054985 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9gx9\" (UniqueName: \"kubernetes.io/projected/29311969-a272-4017-b1a6-be36bf991edd-kube-api-access-w9gx9\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.164111 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-config\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.164200 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9gx9\" (UniqueName: \"kubernetes.io/projected/29311969-a272-4017-b1a6-be36bf991edd-kube-api-access-w9gx9\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " 
pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.164274 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.165318 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.165645 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-config\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.194117 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9gx9\" (UniqueName: \"kubernetes.io/projected/29311969-a272-4017-b1a6-be36bf991edd-kube-api-access-w9gx9\") pod \"dnsmasq-dns-98ddfc8f-kfgb8\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.271551 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.369288 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-9qcm6"] Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.705737 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" containerID="9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56" exitCode=0 Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.705778 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" event={"ID":"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed","Type":"ContainerDied","Data":"9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56"} Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.705802 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" event={"ID":"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed","Type":"ContainerStarted","Data":"8859b667d4b7f06031155812bea390242f777440d3648fb6d2dadb899ac85004"} Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.767067 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-kfgb8"] Dec 09 04:33:52 crc kubenswrapper[4766]: W1209 04:33:52.776265 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29311969_a272_4017_b1a6_be36bf991edd.slice/crio-4d97767b4b3d0a9851902eb1b80dee93f4a3dd82287aabd9be478c34ce7e3cfb WatchSource:0}: Error finding container 4d97767b4b3d0a9851902eb1b80dee93f4a3dd82287aabd9be478c34ce7e3cfb: Status 404 returned error can't find the container with id 4d97767b4b3d0a9851902eb1b80dee93f4a3dd82287aabd9be478c34ce7e3cfb Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.858421 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 04:33:52 crc 
kubenswrapper[4766]: I1209 04:33:52.860015 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.860142 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.861870 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.861910 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.861873 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.862090 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pr6vt" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.862370 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.977430 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7688639-6791-4900-8a65-849e5e501fda-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.978309 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.978482 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.978539 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.978580 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.978638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7688639-6791-4900-8a65-849e5e501fda-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.978708 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.978737 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lc7\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-kube-api-access-b9lc7\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:52 crc kubenswrapper[4766]: I1209 04:33:52.979421 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: E1209 04:33:53.061331 4766 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 09 04:33:53 crc kubenswrapper[4766]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 09 04:33:53 crc kubenswrapper[4766]: > podSandboxID="8859b667d4b7f06031155812bea390242f777440d3648fb6d2dadb899ac85004" Dec 09 04:33:53 crc kubenswrapper[4766]: E1209 04:33:53.061546 4766 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 09 04:33:53 crc kubenswrapper[4766]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldvhc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-9qcm6_openstack(2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 09 04:33:53 crc kubenswrapper[4766]: > logger="UnhandledError" Dec 09 04:33:53 crc kubenswrapper[4766]: E1209 04:33:53.062754 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" podUID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081059 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7688639-6791-4900-8a65-849e5e501fda-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " 
pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081118 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081164 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081201 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081256 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081277 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7688639-6791-4900-8a65-849e5e501fda-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081327 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081351 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lc7\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-kube-api-access-b9lc7\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081388 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081915 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.081955 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.083012 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.083169 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.087163 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.087233 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d35f51cb7ce9e67be7893575eb60711ab0d51a89814f019d626546810c453818/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.087418 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7688639-6791-4900-8a65-849e5e501fda-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.087717 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.088029 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7688639-6791-4900-8a65-849e5e501fda-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.105113 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lc7\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-kube-api-access-b9lc7\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.123070 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"rabbitmq-server-0\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.128061 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.129289 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.130899 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.131233 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.133394 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-85shk" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.134750 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.134888 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.169546 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.209047 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284157 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29z7\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-kube-api-access-j29z7\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284473 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284503 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284549 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e475cce-e1a9-48d1-aede-42163597ab9f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284586 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284606 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284647 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e475cce-e1a9-48d1-aede-42163597ab9f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284684 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.284700 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386488 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j29z7\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-kube-api-access-j29z7\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386554 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386593 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386633 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e475cce-e1a9-48d1-aede-42163597ab9f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386671 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386695 4766 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386728 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e475cce-e1a9-48d1-aede-42163597ab9f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386758 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.386781 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.389142 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.389541 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.390151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.390669 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.393370 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e475cce-e1a9-48d1-aede-42163597ab9f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.393600 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e475cce-e1a9-48d1-aede-42163597ab9f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.394355 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.398243 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.398268 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f2701008402cad3cf1a00c3b11fcd7bdf81ee0cb8484f7efc79ac3b40f431cba/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.410729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29z7\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-kube-api-access-j29z7\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.431060 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.436655 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.485892 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.715415 4766 generic.go:334] "Generic (PLEG): container finished" podID="29311969-a272-4017-b1a6-be36bf991edd" containerID="ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88" exitCode=0 Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.715777 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" event={"ID":"29311969-a272-4017-b1a6-be36bf991edd","Type":"ContainerDied","Data":"ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88"} Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.715811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" event={"ID":"29311969-a272-4017-b1a6-be36bf991edd","Type":"ContainerStarted","Data":"4d97767b4b3d0a9851902eb1b80dee93f4a3dd82287aabd9be478c34ce7e3cfb"} Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.717889 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7688639-6791-4900-8a65-849e5e501fda","Type":"ContainerStarted","Data":"643e8c66c20004b284cb2343923dd5fb0f25f90c2e0a3fd64d8d943ba77fcfc1"} Dec 09 04:33:53 crc kubenswrapper[4766]: I1209 04:33:53.915844 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:33:53 crc kubenswrapper[4766]: W1209 04:33:53.918598 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e475cce_e1a9_48d1_aede_42163597ab9f.slice/crio-19d4af020a5fb9af0c11daa06272e00327faed933b2fd6912965deca065390b4 WatchSource:0}: Error finding container 19d4af020a5fb9af0c11daa06272e00327faed933b2fd6912965deca065390b4: Status 404 returned error can't find the container with id 19d4af020a5fb9af0c11daa06272e00327faed933b2fd6912965deca065390b4 Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 
04:33:54.383527 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.386706 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.391046 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.391292 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mv4l5" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.391292 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.392705 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.398016 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.400306 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.513600 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-kolla-config\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.513653 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/970b977f-fe90-4458-9acd-bef89b225b9d-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.513676 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970b977f-fe90-4458-9acd-bef89b225b9d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.513711 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8bf2df2b-9a20-4ba5-b377-10638c750465\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bf2df2b-9a20-4ba5-b377-10638c750465\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.513753 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.513775 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9x5\" (UniqueName: \"kubernetes.io/projected/970b977f-fe90-4458-9acd-bef89b225b9d-kube-api-access-bh9x5\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.513807 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/970b977f-fe90-4458-9acd-bef89b225b9d-galera-tls-certs\") pod \"openstack-galera-0\" 
(UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.513853 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-config-data-default\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.615423 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/970b977f-fe90-4458-9acd-bef89b225b9d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.615684 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-config-data-default\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.615814 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-kolla-config\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.616008 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/970b977f-fe90-4458-9acd-bef89b225b9d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc 
kubenswrapper[4766]: I1209 04:33:54.616488 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970b977f-fe90-4458-9acd-bef89b225b9d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.616939 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8bf2df2b-9a20-4ba5-b377-10638c750465\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bf2df2b-9a20-4ba5-b377-10638c750465\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.617062 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.617233 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh9x5\" (UniqueName: \"kubernetes.io/projected/970b977f-fe90-4458-9acd-bef89b225b9d-kube-api-access-bh9x5\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.616447 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/970b977f-fe90-4458-9acd-bef89b225b9d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.616775 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-config-data-default\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.616790 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-kolla-config\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.618154 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/970b977f-fe90-4458-9acd-bef89b225b9d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.619867 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.619908 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8bf2df2b-9a20-4ba5-b377-10638c750465\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bf2df2b-9a20-4ba5-b377-10638c750465\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8dace88f6426f330cf91b8f7eb54d35027d6e6cd4ad3123949b9e79df90b8323/globalmount\"" pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.724816 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8e475cce-e1a9-48d1-aede-42163597ab9f","Type":"ContainerStarted","Data":"19d4af020a5fb9af0c11daa06272e00327faed933b2fd6912965deca065390b4"} Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.726889 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" event={"ID":"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed","Type":"ContainerStarted","Data":"a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f"} Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.727112 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.728518 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" event={"ID":"29311969-a272-4017-b1a6-be36bf991edd","Type":"ContainerStarted","Data":"80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416"} Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.728646 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.750373 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/memcached-0"] Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.751854 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.754532 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.756740 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/970b977f-fe90-4458-9acd-bef89b225b9d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.758426 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qdchg" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.768006 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/970b977f-fe90-4458-9acd-bef89b225b9d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.769156 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9x5\" (UniqueName: \"kubernetes.io/projected/970b977f-fe90-4458-9acd-bef89b225b9d-kube-api-access-bh9x5\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.772168 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" podStartSLOduration=3.772143586 podStartE2EDuration="3.772143586s" podCreationTimestamp="2025-12-09 04:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:33:54.766569435 +0000 UTC m=+4916.475874861" watchObservedRunningTime="2025-12-09 04:33:54.772143586 +0000 UTC m=+4916.481449012" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.785777 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.803352 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" podStartSLOduration=3.80333233 podStartE2EDuration="3.80333233s" podCreationTimestamp="2025-12-09 04:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:33:54.795459538 +0000 UTC m=+4916.504764964" watchObservedRunningTime="2025-12-09 04:33:54.80333233 +0000 UTC m=+4916.512637756" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.886758 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8bf2df2b-9a20-4ba5-b377-10638c750465\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8bf2df2b-9a20-4ba5-b377-10638c750465\") pod \"openstack-galera-0\" (UID: \"970b977f-fe90-4458-9acd-bef89b225b9d\") " pod="openstack/openstack-galera-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.926591 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4e13dec0-38aa-4f9f-944a-f96185e640eb-kolla-config\") pod \"memcached-0\" (UID: \"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.926767 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv2r8\" (UniqueName: \"kubernetes.io/projected/4e13dec0-38aa-4f9f-944a-f96185e640eb-kube-api-access-rv2r8\") 
pod \"memcached-0\" (UID: \"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:54 crc kubenswrapper[4766]: I1209 04:33:54.926879 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e13dec0-38aa-4f9f-944a-f96185e640eb-config-data\") pod \"memcached-0\" (UID: \"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.012810 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.028792 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e13dec0-38aa-4f9f-944a-f96185e640eb-config-data\") pod \"memcached-0\" (UID: \"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.028877 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4e13dec0-38aa-4f9f-944a-f96185e640eb-kolla-config\") pod \"memcached-0\" (UID: \"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.028932 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv2r8\" (UniqueName: \"kubernetes.io/projected/4e13dec0-38aa-4f9f-944a-f96185e640eb-kube-api-access-rv2r8\") pod \"memcached-0\" (UID: \"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.030379 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4e13dec0-38aa-4f9f-944a-f96185e640eb-kolla-config\") pod \"memcached-0\" (UID: 
\"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.033853 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e13dec0-38aa-4f9f-944a-f96185e640eb-config-data\") pod \"memcached-0\" (UID: \"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.047791 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv2r8\" (UniqueName: \"kubernetes.io/projected/4e13dec0-38aa-4f9f-944a-f96185e640eb-kube-api-access-rv2r8\") pod \"memcached-0\" (UID: \"4e13dec0-38aa-4f9f-944a-f96185e640eb\") " pod="openstack/memcached-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.230020 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.482726 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 09 04:33:55 crc kubenswrapper[4766]: W1209 04:33:55.495706 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod970b977f_fe90_4458_9acd_bef89b225b9d.slice/crio-e429b997f3e85e4bd2bb95cc63d13594ef92150c2b89fbdb4b762914c6bd60c9 WatchSource:0}: Error finding container e429b997f3e85e4bd2bb95cc63d13594ef92150c2b89fbdb4b762914c6bd60c9: Status 404 returned error can't find the container with id e429b997f3e85e4bd2bb95cc63d13594ef92150c2b89fbdb4b762914c6bd60c9 Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.636107 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 09 04:33:55 crc kubenswrapper[4766]: W1209 04:33:55.639495 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e13dec0_38aa_4f9f_944a_f96185e640eb.slice/crio-6aba9fea2fad8e3584533a2a1cff47e69a0967d32d08cb84482d61e506220200 WatchSource:0}: Error finding container 6aba9fea2fad8e3584533a2a1cff47e69a0967d32d08cb84482d61e506220200: Status 404 returned error can't find the container with id 6aba9fea2fad8e3584533a2a1cff47e69a0967d32d08cb84482d61e506220200 Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.737664 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4e13dec0-38aa-4f9f-944a-f96185e640eb","Type":"ContainerStarted","Data":"6aba9fea2fad8e3584533a2a1cff47e69a0967d32d08cb84482d61e506220200"} Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.739562 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"970b977f-fe90-4458-9acd-bef89b225b9d","Type":"ContainerStarted","Data":"39b5cb85f7ab4c0504a296dd5e3e23be4c4d6844170bec3a462108842cba75cf"} Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.739591 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"970b977f-fe90-4458-9acd-bef89b225b9d","Type":"ContainerStarted","Data":"e429b997f3e85e4bd2bb95cc63d13594ef92150c2b89fbdb4b762914c6bd60c9"} Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.741894 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7688639-6791-4900-8a65-849e5e501fda","Type":"ContainerStarted","Data":"20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a"} Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.927451 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.928626 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.930513 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hwh25" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.931199 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.931270 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.932740 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 09 04:33:55 crc kubenswrapper[4766]: I1209 04:33:55.988131 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.042631 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.042738 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.042804 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rn58\" (UniqueName: 
\"kubernetes.io/projected/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-kube-api-access-6rn58\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.042865 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.043023 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.043458 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.043531 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.043582 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-9e8abfdc-f45f-4ef2-80a3-fb5844d43d35\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e8abfdc-f45f-4ef2-80a3-fb5844d43d35\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.144636 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.144688 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.144716 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9e8abfdc-f45f-4ef2-80a3-fb5844d43d35\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e8abfdc-f45f-4ef2-80a3-fb5844d43d35\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.144749 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.144770 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.144788 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rn58\" (UniqueName: \"kubernetes.io/projected/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-kube-api-access-6rn58\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.145710 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.145750 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.146124 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.146402 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.146618 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.147521 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.149316 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.149382 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9e8abfdc-f45f-4ef2-80a3-fb5844d43d35\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e8abfdc-f45f-4ef2-80a3-fb5844d43d35\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1a11615ea471ffd4876b73d54361cc94e14fde271ec8544e41f4d0f7404f722a/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.258347 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.259203 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rn58\" (UniqueName: \"kubernetes.io/projected/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-kube-api-access-6rn58\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.260718 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2a4151-7484-4aad-8fc9-e7c7bbbb753c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.285454 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9e8abfdc-f45f-4ef2-80a3-fb5844d43d35\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e8abfdc-f45f-4ef2-80a3-fb5844d43d35\") pod \"openstack-cell1-galera-0\" (UID: \"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c\") " pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.547710 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.750451 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8e475cce-e1a9-48d1-aede-42163597ab9f","Type":"ContainerStarted","Data":"a01bd4727041af3f878626add1757349ac196a31902d1ab24f3c042d331c70f0"} Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.752736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4e13dec0-38aa-4f9f-944a-f96185e640eb","Type":"ContainerStarted","Data":"d0bf535785c75b65c24b8593e0334808359a439754b28e8cdbb00db344e0bf8f"} Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.755114 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.810148 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.810121878 podStartE2EDuration="2.810121878s" podCreationTimestamp="2025-12-09 04:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:33:56.79543227 +0000 UTC m=+4918.504737716" watchObservedRunningTime="2025-12-09 04:33:56.810121878 +0000 UTC m=+4918.519427344" Dec 09 04:33:56 crc kubenswrapper[4766]: I1209 04:33:56.982545 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 09 04:33:57 crc kubenswrapper[4766]: I1209 04:33:57.761847 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c","Type":"ContainerStarted","Data":"c83be406f687c23c179ad0ca522526414a2eb7490156f1e171efc09e24345ce1"} Dec 09 04:33:57 crc kubenswrapper[4766]: I1209 04:33:57.762256 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c","Type":"ContainerStarted","Data":"e9f519a7c964ba6386ebd2630bfa970f091942eccdfdc1b9dfc31c3a57e6524e"} Dec 09 04:33:59 crc kubenswrapper[4766]: I1209 04:33:59.776223 4766 generic.go:334] "Generic (PLEG): container finished" podID="970b977f-fe90-4458-9acd-bef89b225b9d" containerID="39b5cb85f7ab4c0504a296dd5e3e23be4c4d6844170bec3a462108842cba75cf" exitCode=0 Dec 09 04:33:59 crc kubenswrapper[4766]: I1209 04:33:59.776323 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"970b977f-fe90-4458-9acd-bef89b225b9d","Type":"ContainerDied","Data":"39b5cb85f7ab4c0504a296dd5e3e23be4c4d6844170bec3a462108842cba75cf"} Dec 09 04:34:00 crc kubenswrapper[4766]: I1209 04:34:00.231630 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 09 04:34:00 crc kubenswrapper[4766]: I1209 04:34:00.784773 4766 generic.go:334] "Generic (PLEG): container finished" podID="ad2a4151-7484-4aad-8fc9-e7c7bbbb753c" containerID="c83be406f687c23c179ad0ca522526414a2eb7490156f1e171efc09e24345ce1" exitCode=0 Dec 09 04:34:00 crc kubenswrapper[4766]: I1209 04:34:00.784847 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c","Type":"ContainerDied","Data":"c83be406f687c23c179ad0ca522526414a2eb7490156f1e171efc09e24345ce1"} Dec 09 04:34:00 crc kubenswrapper[4766]: I1209 04:34:00.786915 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"970b977f-fe90-4458-9acd-bef89b225b9d","Type":"ContainerStarted","Data":"d751f6c09051f6d350e2a37d7a450a2d74179e60f78b1db36deb5f8b317fb98e"} Dec 09 04:34:00 crc kubenswrapper[4766]: I1209 04:34:00.862959 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.862932893 podStartE2EDuration="7.862932893s" podCreationTimestamp="2025-12-09 04:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:34:00.84878054 +0000 UTC m=+4922.558085956" watchObservedRunningTime="2025-12-09 04:34:00.862932893 +0000 UTC m=+4922.572238309" Dec 09 04:34:01 crc kubenswrapper[4766]: I1209 04:34:01.801314 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ad2a4151-7484-4aad-8fc9-e7c7bbbb753c","Type":"ContainerStarted","Data":"8032acfbfb1a4837247e6473a8c8717375082386c35e87dd5a2a246c868b0ec1"} Dec 09 04:34:01 crc kubenswrapper[4766]: I1209 04:34:01.849260 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.849238004 podStartE2EDuration="7.849238004s" podCreationTimestamp="2025-12-09 04:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:34:01.83471609 +0000 UTC m=+4923.544021596" watchObservedRunningTime="2025-12-09 04:34:01.849238004 +0000 UTC m=+4923.558543470" Dec 09 04:34:01 crc kubenswrapper[4766]: I1209 04:34:01.998533 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:34:02 crc kubenswrapper[4766]: I1209 04:34:02.273448 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:34:02 crc kubenswrapper[4766]: I1209 
04:34:02.316994 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-9qcm6"] Dec 09 04:34:02 crc kubenswrapper[4766]: I1209 04:34:02.807493 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" podUID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" containerName="dnsmasq-dns" containerID="cri-o://a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f" gracePeriod=10 Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.231130 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.407816 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldvhc\" (UniqueName: \"kubernetes.io/projected/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-kube-api-access-ldvhc\") pod \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.407936 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-dns-svc\") pod \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.408027 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-config\") pod \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\" (UID: \"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed\") " Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.415056 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-kube-api-access-ldvhc" (OuterVolumeSpecName: "kube-api-access-ldvhc") pod 
"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" (UID: "2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed"). InnerVolumeSpecName "kube-api-access-ldvhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.447061 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-config" (OuterVolumeSpecName: "config") pod "2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" (UID: "2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.449101 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" (UID: "2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.510386 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.510422 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.510439 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldvhc\" (UniqueName: \"kubernetes.io/projected/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed-kube-api-access-ldvhc\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.818328 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" 
containerID="a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f" exitCode=0 Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.818370 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" event={"ID":"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed","Type":"ContainerDied","Data":"a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f"} Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.818403 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.818431 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-9qcm6" event={"ID":"2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed","Type":"ContainerDied","Data":"8859b667d4b7f06031155812bea390242f777440d3648fb6d2dadb899ac85004"} Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.818460 4766 scope.go:117] "RemoveContainer" containerID="a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.859947 4766 scope.go:117] "RemoveContainer" containerID="9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.863286 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-9qcm6"] Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.875552 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-9qcm6"] Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.882544 4766 scope.go:117] "RemoveContainer" containerID="a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f" Dec 09 04:34:03 crc kubenswrapper[4766]: E1209 04:34:03.883109 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f\": container with ID starting with a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f not found: ID does not exist" containerID="a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.883173 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f"} err="failed to get container status \"a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f\": rpc error: code = NotFound desc = could not find container \"a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f\": container with ID starting with a0ac6e7ba1e1f80bc211449be74cb5e389d23ddc6397354692dce3255a0d577f not found: ID does not exist" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.883197 4766 scope.go:117] "RemoveContainer" containerID="9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56" Dec 09 04:34:03 crc kubenswrapper[4766]: E1209 04:34:03.883494 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56\": container with ID starting with 9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56 not found: ID does not exist" containerID="9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56" Dec 09 04:34:03 crc kubenswrapper[4766]: I1209 04:34:03.883530 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56"} err="failed to get container status \"9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56\": rpc error: code = NotFound desc = could not find container \"9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56\": container with ID 
starting with 9287db7c9882ebc572a92601fea985ccd60713ce81ce4a1c85b8a268fb499c56 not found: ID does not exist" Dec 09 04:34:04 crc kubenswrapper[4766]: I1209 04:34:04.848983 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" path="/var/lib/kubelet/pods/2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed/volumes" Dec 09 04:34:05 crc kubenswrapper[4766]: I1209 04:34:05.013642 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 09 04:34:05 crc kubenswrapper[4766]: I1209 04:34:05.013695 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 09 04:34:05 crc kubenswrapper[4766]: I1209 04:34:05.239701 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 09 04:34:05 crc kubenswrapper[4766]: I1209 04:34:05.937412 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 09 04:34:06 crc kubenswrapper[4766]: I1209 04:34:06.548316 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 09 04:34:06 crc kubenswrapper[4766]: I1209 04:34:06.548363 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 09 04:34:07 crc kubenswrapper[4766]: I1209 04:34:07.316797 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:34:07 crc kubenswrapper[4766]: I1209 04:34:07.317176 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:34:08 crc kubenswrapper[4766]: I1209 04:34:08.796654 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 09 04:34:08 crc kubenswrapper[4766]: I1209 04:34:08.878987 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 09 04:34:27 crc kubenswrapper[4766]: I1209 04:34:27.033548 4766 generic.go:334] "Generic (PLEG): container finished" podID="f7688639-6791-4900-8a65-849e5e501fda" containerID="20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a" exitCode=0 Dec 09 04:34:27 crc kubenswrapper[4766]: I1209 04:34:27.033647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7688639-6791-4900-8a65-849e5e501fda","Type":"ContainerDied","Data":"20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a"} Dec 09 04:34:28 crc kubenswrapper[4766]: I1209 04:34:28.051804 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7688639-6791-4900-8a65-849e5e501fda","Type":"ContainerStarted","Data":"3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4"} Dec 09 04:34:28 crc kubenswrapper[4766]: I1209 04:34:28.052927 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 09 04:34:29 crc kubenswrapper[4766]: I1209 04:34:29.062352 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerID="a01bd4727041af3f878626add1757349ac196a31902d1ab24f3c042d331c70f0" exitCode=0 Dec 09 04:34:29 crc kubenswrapper[4766]: I1209 04:34:29.062469 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"8e475cce-e1a9-48d1-aede-42163597ab9f","Type":"ContainerDied","Data":"a01bd4727041af3f878626add1757349ac196a31902d1ab24f3c042d331c70f0"} Dec 09 04:34:29 crc kubenswrapper[4766]: I1209 04:34:29.105873 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.105850381 podStartE2EDuration="38.105850381s" podCreationTimestamp="2025-12-09 04:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:34:28.08479571 +0000 UTC m=+4949.794101176" watchObservedRunningTime="2025-12-09 04:34:29.105850381 +0000 UTC m=+4950.815155827" Dec 09 04:34:30 crc kubenswrapper[4766]: I1209 04:34:30.071873 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8e475cce-e1a9-48d1-aede-42163597ab9f","Type":"ContainerStarted","Data":"6705a09b78a70c8b1b468fff349ecc14dd14207b3778188b524b9bb6cf777ce8"} Dec 09 04:34:30 crc kubenswrapper[4766]: I1209 04:34:30.072831 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:34:30 crc kubenswrapper[4766]: I1209 04:34:30.105840 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.105808281 podStartE2EDuration="38.105808281s" podCreationTimestamp="2025-12-09 04:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:34:30.101664039 +0000 UTC m=+4951.810969595" watchObservedRunningTime="2025-12-09 04:34:30.105808281 +0000 UTC m=+4951.815113747" Dec 09 04:34:37 crc kubenswrapper[4766]: I1209 04:34:37.316962 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:34:37 crc kubenswrapper[4766]: I1209 04:34:37.317692 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:34:37 crc kubenswrapper[4766]: I1209 04:34:37.317767 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:34:37 crc kubenswrapper[4766]: I1209 04:34:37.318787 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:34:37 crc kubenswrapper[4766]: I1209 04:34:37.318891 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" gracePeriod=600 Dec 09 04:34:37 crc kubenswrapper[4766]: E1209 04:34:37.445279 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:34:37 crc kubenswrapper[4766]: E1209 04:34:37.491042 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42b369b_e4ad_447c_b9b1_5c2461116838.slice/crio-204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42b369b_e4ad_447c_b9b1_5c2461116838.slice/crio-conmon-204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9.scope\": RecentStats: unable to find data in memory cache]" Dec 09 04:34:38 crc kubenswrapper[4766]: I1209 04:34:38.138132 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" exitCode=0 Dec 09 04:34:38 crc kubenswrapper[4766]: I1209 04:34:38.138209 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9"} Dec 09 04:34:38 crc kubenswrapper[4766]: I1209 04:34:38.138303 4766 scope.go:117] "RemoveContainer" containerID="af95de5f173efefe23a3858a8ceadb3e71c7f893eebc5787426af7094bd6a972" Dec 09 04:34:38 crc kubenswrapper[4766]: I1209 04:34:38.139135 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:34:38 crc kubenswrapper[4766]: E1209 04:34:38.139702 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:34:43 crc kubenswrapper[4766]: I1209 04:34:43.174243 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 09 04:34:43 crc kubenswrapper[4766]: I1209 04:34:43.494398 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.839352 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-b69z7"] Dec 09 04:34:49 crc kubenswrapper[4766]: E1209 04:34:49.840113 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" containerName="init" Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.840126 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" containerName="init" Dec 09 04:34:49 crc kubenswrapper[4766]: E1209 04:34:49.840140 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" containerName="dnsmasq-dns" Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.840146 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" containerName="dnsmasq-dns" Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.840315 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8f1cfc-41aa-4004-b1d5-9fddd79bbfed" containerName="dnsmasq-dns" Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.841078 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.853149 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-b69z7"] Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.930082 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.930383 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkj7k\" (UniqueName: \"kubernetes.io/projected/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-kube-api-access-tkj7k\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:49 crc kubenswrapper[4766]: I1209 04:34:49.930464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-config\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.031519 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.031664 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkj7k\" (UniqueName: 
\"kubernetes.io/projected/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-kube-api-access-tkj7k\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.031694 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-config\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.032853 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-config\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.032861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.054200 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkj7k\" (UniqueName: \"kubernetes.io/projected/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-kube-api-access-tkj7k\") pod \"dnsmasq-dns-5b7946d7b9-b69z7\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.164293 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.573572 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.618508 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-b69z7"] Dec 09 04:34:50 crc kubenswrapper[4766]: I1209 04:34:50.839203 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:34:50 crc kubenswrapper[4766]: E1209 04:34:50.839658 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:34:51 crc kubenswrapper[4766]: I1209 04:34:51.269371 4766 generic.go:334] "Generic (PLEG): container finished" podID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" containerID="2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c" exitCode=0 Dec 09 04:34:51 crc kubenswrapper[4766]: I1209 04:34:51.269413 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" event={"ID":"1f0168a3-945b-49f7-9e2f-8ec5b7a66233","Type":"ContainerDied","Data":"2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c"} Dec 09 04:34:51 crc kubenswrapper[4766]: I1209 04:34:51.269438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" event={"ID":"1f0168a3-945b-49f7-9e2f-8ec5b7a66233","Type":"ContainerStarted","Data":"b68707598dddffa506655c27aa41a327ce5075f208811452a8974fa1d6920dbc"} Dec 09 04:34:51 crc kubenswrapper[4766]: I1209 
04:34:51.270569 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:34:52 crc kubenswrapper[4766]: I1209 04:34:52.280309 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" event={"ID":"1f0168a3-945b-49f7-9e2f-8ec5b7a66233","Type":"ContainerStarted","Data":"8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e"} Dec 09 04:34:52 crc kubenswrapper[4766]: I1209 04:34:52.280818 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:34:52 crc kubenswrapper[4766]: I1209 04:34:52.308684 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" podStartSLOduration=3.308663914 podStartE2EDuration="3.308663914s" podCreationTimestamp="2025-12-09 04:34:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:34:52.306359072 +0000 UTC m=+4974.015664538" watchObservedRunningTime="2025-12-09 04:34:52.308663914 +0000 UTC m=+4974.017969360" Dec 09 04:34:52 crc kubenswrapper[4766]: I1209 04:34:52.383673 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f7688639-6791-4900-8a65-849e5e501fda" containerName="rabbitmq" containerID="cri-o://3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4" gracePeriod=604799 Dec 09 04:34:53 crc kubenswrapper[4766]: I1209 04:34:53.042058 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerName="rabbitmq" containerID="cri-o://6705a09b78a70c8b1b468fff349ecc14dd14207b3778188b524b9bb6cf777ce8" gracePeriod=604799 Dec 09 04:34:53 crc kubenswrapper[4766]: I1209 04:34:53.170909 4766 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/rabbitmq-server-0" podUID="f7688639-6791-4900-8a65-849e5e501fda" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.235:5672: connect: connection refused" Dec 09 04:34:53 crc kubenswrapper[4766]: I1209 04:34:53.492317 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.236:5672: connect: connection refused" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.175656 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g8hff"] Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.178705 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.197072 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8hff"] Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.252270 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-utilities\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.252565 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87dqd\" (UniqueName: \"kubernetes.io/projected/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-kube-api-access-87dqd\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.252844 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-catalog-content\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.354798 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-utilities\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.355147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87dqd\" (UniqueName: \"kubernetes.io/projected/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-kube-api-access-87dqd\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.355332 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-catalog-content\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.355651 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-utilities\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.355755 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-catalog-content\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.382106 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87dqd\" (UniqueName: \"kubernetes.io/projected/327bdf4b-36b9-44f2-8e3e-38fa0fccebf7-kube-api-access-87dqd\") pod \"certified-operators-g8hff\" (UID: \"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7\") " pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.553947 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:34:57 crc kubenswrapper[4766]: I1209 04:34:57.983837 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8hff"] Dec 09 04:34:58 crc kubenswrapper[4766]: I1209 04:34:58.327645 4766 generic.go:334] "Generic (PLEG): container finished" podID="327bdf4b-36b9-44f2-8e3e-38fa0fccebf7" containerID="77785d2c54e051e48ffca59be3dad94c3f57e2fd92c7d01ba91c187b40d00481" exitCode=0 Dec 09 04:34:58 crc kubenswrapper[4766]: I1209 04:34:58.327761 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8hff" event={"ID":"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7","Type":"ContainerDied","Data":"77785d2c54e051e48ffca59be3dad94c3f57e2fd92c7d01ba91c187b40d00481"} Dec 09 04:34:58 crc kubenswrapper[4766]: I1209 04:34:58.329321 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8hff" event={"ID":"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7","Type":"ContainerStarted","Data":"e57d60839bad44d58c411598c8575c705cd0d578404bd4f8bdc3c17b1eff64ba"} Dec 09 04:34:59 crc 
kubenswrapper[4766]: I1209 04:34:59.045742 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.086251 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-confd\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.086310 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-plugins\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.086346 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-plugins-conf\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.086474 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.086545 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9lc7\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-kube-api-access-b9lc7\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: 
I1209 04:34:59.086587 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-erlang-cookie\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.087224 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.087337 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.087736 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.087908 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7688639-6791-4900-8a65-849e5e501fda-pod-info\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.088557 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7688639-6791-4900-8a65-849e5e501fda-erlang-cookie-secret\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.088607 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-server-conf\") pod \"f7688639-6791-4900-8a65-849e5e501fda\" (UID: \"f7688639-6791-4900-8a65-849e5e501fda\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.089187 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.089364 4766 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.089410 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.101944 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-kube-api-access-b9lc7" (OuterVolumeSpecName: "kube-api-access-b9lc7") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "kube-api-access-b9lc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.111623 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7688639-6791-4900-8a65-849e5e501fda-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.121064 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f7688639-6791-4900-8a65-849e5e501fda-pod-info" (OuterVolumeSpecName: "pod-info") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.132388 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-server-conf" (OuterVolumeSpecName: "server-conf") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.134557 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088" (OuterVolumeSpecName: "persistence") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "pvc-2e91d212-aae8-4627-af9f-818659632088". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.192294 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") on node \"crc\" " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.192325 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9lc7\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-kube-api-access-b9lc7\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.192336 4766 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7688639-6791-4900-8a65-849e5e501fda-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.192348 4766 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7688639-6791-4900-8a65-849e5e501fda-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.192357 4766 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f7688639-6791-4900-8a65-849e5e501fda-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.216561 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f7688639-6791-4900-8a65-849e5e501fda" (UID: "f7688639-6791-4900-8a65-849e5e501fda"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.219288 4766 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.219464 4766 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2e91d212-aae8-4627-af9f-818659632088" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088") on node "crc" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.293070 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7688639-6791-4900-8a65-849e5e501fda-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.293110 4766 reconciler_common.go:293] "Volume detached for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.339848 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerID="6705a09b78a70c8b1b468fff349ecc14dd14207b3778188b524b9bb6cf777ce8" exitCode=0 Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.339936 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"8e475cce-e1a9-48d1-aede-42163597ab9f","Type":"ContainerDied","Data":"6705a09b78a70c8b1b468fff349ecc14dd14207b3778188b524b9bb6cf777ce8"} Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.341833 4766 generic.go:334] "Generic (PLEG): container finished" podID="f7688639-6791-4900-8a65-849e5e501fda" containerID="3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4" exitCode=0 Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.341861 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7688639-6791-4900-8a65-849e5e501fda","Type":"ContainerDied","Data":"3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4"} Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.341898 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.341920 4766 scope.go:117] "RemoveContainer" containerID="3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.341906 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f7688639-6791-4900-8a65-849e5e501fda","Type":"ContainerDied","Data":"643e8c66c20004b284cb2343923dd5fb0f25f90c2e0a3fd64d8d943ba77fcfc1"} Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.364811 4766 scope.go:117] "RemoveContainer" containerID="20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.380080 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.388515 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.407351 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] 
Dec 09 04:34:59 crc kubenswrapper[4766]: E1209 04:34:59.408466 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7688639-6791-4900-8a65-849e5e501fda" containerName="setup-container" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.408484 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7688639-6791-4900-8a65-849e5e501fda" containerName="setup-container" Dec 09 04:34:59 crc kubenswrapper[4766]: E1209 04:34:59.408524 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7688639-6791-4900-8a65-849e5e501fda" containerName="rabbitmq" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.408532 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7688639-6791-4900-8a65-849e5e501fda" containerName="rabbitmq" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.408898 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7688639-6791-4900-8a65-849e5e501fda" containerName="rabbitmq" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.410329 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.415004 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pr6vt" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.416794 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.417011 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.420405 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.424054 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.424064 4766 scope.go:117] "RemoveContainer" containerID="3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4" Dec 09 04:34:59 crc kubenswrapper[4766]: E1209 04:34:59.424667 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4\": container with ID starting with 3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4 not found: ID does not exist" containerID="3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.424706 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4"} err="failed to get container status \"3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4\": rpc error: code = NotFound desc = could not find container 
\"3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4\": container with ID starting with 3290205bf1877e9c544cb0bb55895bbf7d2613d76bfee850d883776cb4b3eaf4 not found: ID does not exist" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.424740 4766 scope.go:117] "RemoveContainer" containerID="20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a" Dec 09 04:34:59 crc kubenswrapper[4766]: E1209 04:34:59.427541 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a\": container with ID starting with 20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a not found: ID does not exist" containerID="20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.427888 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a"} err="failed to get container status \"20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a\": rpc error: code = NotFound desc = could not find container \"20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a\": container with ID starting with 20099de524b29b12e8f96bb2887a5aa282bd635bbaa5b0f660bba2465eedf29a not found: ID does not exist" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.432890 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.500336 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc 
kubenswrapper[4766]: I1209 04:34:59.500393 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.500417 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.500467 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnlpx\" (UniqueName: \"kubernetes.io/projected/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-kube-api-access-gnlpx\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.500500 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.500528 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc 
kubenswrapper[4766]: I1209 04:34:59.500555 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.500610 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.500670 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.555037 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jprlb"] Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.563910 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.574750 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.582321 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jprlb"] Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.602404 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-plugins\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.602801 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.602883 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-confd\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.602922 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-server-conf\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603001 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603031 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-erlang-cookie\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603072 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e475cce-e1a9-48d1-aede-42163597ab9f-pod-info\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603108 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e475cce-e1a9-48d1-aede-42163597ab9f-erlang-cookie-secret\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603129 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j29z7\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-kube-api-access-j29z7\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603150 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-plugins-conf\") pod \"8e475cce-e1a9-48d1-aede-42163597ab9f\" (UID: \"8e475cce-e1a9-48d1-aede-42163597ab9f\") " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603313 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnlpx\" (UniqueName: \"kubernetes.io/projected/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-kube-api-access-gnlpx\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603347 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603368 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"rabbitmq-server-0\" 
(UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603396 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m865s\" (UniqueName: \"kubernetes.io/projected/1615f7d0-334f-4e62-88de-0e7af2962ac4-kube-api-access-m865s\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603417 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603437 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603461 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603503 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 
04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603524 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-catalog-content\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603545 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603560 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-utilities\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603577 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.603622 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.606104 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.608435 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.608715 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.612866 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.614098 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.614774 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.617350 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e475cce-e1a9-48d1-aede-42163597ab9f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.620422 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.621834 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.621885 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d35f51cb7ce9e67be7893575eb60711ab0d51a89814f019d626546810c453818/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.621999 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.622388 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.623680 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8e475cce-e1a9-48d1-aede-42163597ab9f-pod-info" (OuterVolumeSpecName: "pod-info") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.643670 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnlpx\" (UniqueName: \"kubernetes.io/projected/74a6e9a7-3db2-49eb-bbaf-40536dcc8d88-kube-api-access-gnlpx\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.645554 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc" (OuterVolumeSpecName: "persistence") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.670764 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-kube-api-access-j29z7" (OuterVolumeSpecName: "kube-api-access-j29z7") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "kube-api-access-j29z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.671936 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-server-conf" (OuterVolumeSpecName: "server-conf") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707482 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m865s\" (UniqueName: \"kubernetes.io/projected/1615f7d0-334f-4e62-88de-0e7af2962ac4-kube-api-access-m865s\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707626 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-catalog-content\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707673 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-utilities\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707817 4766 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707845 4766 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8e475cce-e1a9-48d1-aede-42163597ab9f-server-conf\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707889 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") on node \"crc\" " Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707906 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707923 4766 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8e475cce-e1a9-48d1-aede-42163597ab9f-pod-info\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707941 4766 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8e475cce-e1a9-48d1-aede-42163597ab9f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.707956 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j29z7\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-kube-api-access-j29z7\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.710559 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-utilities\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.715611 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-catalog-content\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 
crc kubenswrapper[4766]: I1209 04:34:59.736312 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m865s\" (UniqueName: \"kubernetes.io/projected/1615f7d0-334f-4e62-88de-0e7af2962ac4-kube-api-access-m865s\") pod \"community-operators-jprlb\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.747255 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2e91d212-aae8-4627-af9f-818659632088\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e91d212-aae8-4627-af9f-818659632088\") pod \"rabbitmq-server-0\" (UID: \"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88\") " pod="openstack/rabbitmq-server-0" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.770993 4766 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.771238 4766 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc") on node "crc" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.806469 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8e475cce-e1a9-48d1-aede-42163597ab9f" (UID: "8e475cce-e1a9-48d1-aede-42163597ab9f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.809779 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8e475cce-e1a9-48d1-aede-42163597ab9f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.809816 4766 reconciler_common.go:293] "Volume detached for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") on node \"crc\" DevicePath \"\"" Dec 09 04:34:59 crc kubenswrapper[4766]: I1209 04:34:59.990533 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.044778 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.164280 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nshvg"] Dec 09 04:35:00 crc kubenswrapper[4766]: E1209 04:35:00.165065 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerName="setup-container" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.165083 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerName="setup-container" Dec 09 04:35:00 crc kubenswrapper[4766]: E1209 04:35:00.165108 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerName="rabbitmq" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.165115 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerName="rabbitmq" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 
04:35:00.165286 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e475cce-e1a9-48d1-aede-42163597ab9f" containerName="rabbitmq" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.166400 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.168327 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.170592 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nshvg"] Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.216288 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-utilities\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.216337 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-catalog-content\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.216362 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj7jj\" (UniqueName: \"kubernetes.io/projected/1ceac5f1-9780-4367-be2c-d71b951a9be9-kube-api-access-bj7jj\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.255180 
4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-kfgb8"] Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.255839 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" podUID="29311969-a272-4017-b1a6-be36bf991edd" containerName="dnsmasq-dns" containerID="cri-o://80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416" gracePeriod=10 Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.294297 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jprlb"] Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.318256 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-utilities\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.320766 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-catalog-content\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.320799 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj7jj\" (UniqueName: \"kubernetes.io/projected/1ceac5f1-9780-4367-be2c-d71b951a9be9-kube-api-access-bj7jj\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.321429 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-utilities\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.321583 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-catalog-content\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.350231 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj7jj\" (UniqueName: \"kubernetes.io/projected/1ceac5f1-9780-4367-be2c-d71b951a9be9-kube-api-access-bj7jj\") pod \"redhat-marketplace-nshvg\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.362973 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8e475cce-e1a9-48d1-aede-42163597ab9f","Type":"ContainerDied","Data":"19d4af020a5fb9af0c11daa06272e00327faed933b2fd6912965deca065390b4"} Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.363028 4766 scope.go:117] "RemoveContainer" containerID="6705a09b78a70c8b1b468fff349ecc14dd14207b3778188b524b9bb6cf777ce8" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.363201 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.370846 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprlb" event={"ID":"1615f7d0-334f-4e62-88de-0e7af2962ac4","Type":"ContainerStarted","Data":"2de763e37071f6cf31d521d1dd57060dbb5957a1956369265206c1a5b4a95c66"} Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.398633 4766 scope.go:117] "RemoveContainer" containerID="a01bd4727041af3f878626add1757349ac196a31902d1ab24f3c042d331c70f0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.434690 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.439943 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.455888 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.457111 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.460781 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-85shk" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.460945 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.462608 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.462842 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.462965 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.498483 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.505789 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.527889 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.528050 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.528079 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.528104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.528127 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.528191 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.528238 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sn6s\" (UniqueName: \"kubernetes.io/projected/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-kube-api-access-5sn6s\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.528286 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.528315 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.590665 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 09 04:35:00 crc kubenswrapper[4766]: W1209 
04:35:00.600622 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a6e9a7_3db2_49eb_bbaf_40536dcc8d88.slice/crio-982a367f6f8d399ad70b65be41025097746b27f18843b9080de013b313ee5e67 WatchSource:0}: Error finding container 982a367f6f8d399ad70b65be41025097746b27f18843b9080de013b313ee5e67: Status 404 returned error can't find the container with id 982a367f6f8d399ad70b65be41025097746b27f18843b9080de013b313ee5e67 Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629298 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629378 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629405 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629427 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629463 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629488 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629521 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629605 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sn6s\" (UniqueName: \"kubernetes.io/projected/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-kube-api-access-5sn6s\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.629625 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 
crc kubenswrapper[4766]: I1209 04:35:00.630595 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.630663 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.631018 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.632725 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.634821 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.636787 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.637449 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.637485 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f2701008402cad3cf1a00c3b11fcd7bdf81ee0cb8484f7efc79ac3b40f431cba/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.652565 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sn6s\" (UniqueName: \"kubernetes.io/projected/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-kube-api-access-5sn6s\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.655157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bdcb35ea-4aea-4b76-a13e-6fd4b9398991-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.689934 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07f80ca3-27ca-4b09-8516-d4b08d7d2bbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bdcb35ea-4aea-4b76-a13e-6fd4b9398991\") " pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.774303 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.872441 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e475cce-e1a9-48d1-aede-42163597ab9f" path="/var/lib/kubelet/pods/8e475cce-e1a9-48d1-aede-42163597ab9f/volumes" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.873569 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7688639-6791-4900-8a65-849e5e501fda" path="/var/lib/kubelet/pods/f7688639-6791-4900-8a65-849e5e501fda/volumes" Dec 09 04:35:00 crc kubenswrapper[4766]: I1209 04:35:00.875367 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.010664 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nshvg"] Dec 09 04:35:01 crc kubenswrapper[4766]: W1209 04:35:01.014249 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ceac5f1_9780_4367_be2c_d71b951a9be9.slice/crio-6fb4b913b9a00fc921a5d7533c91300bdaa28d4502664c78bf1fc355fc7b32f4 WatchSource:0}: Error finding container 6fb4b913b9a00fc921a5d7533c91300bdaa28d4502664c78bf1fc355fc7b32f4: Status 404 returned error can't find the container with id 6fb4b913b9a00fc921a5d7533c91300bdaa28d4502664c78bf1fc355fc7b32f4 Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.035290 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-dns-svc\") pod \"29311969-a272-4017-b1a6-be36bf991edd\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.035343 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9gx9\" (UniqueName: \"kubernetes.io/projected/29311969-a272-4017-b1a6-be36bf991edd-kube-api-access-w9gx9\") pod \"29311969-a272-4017-b1a6-be36bf991edd\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.035395 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-config\") pod \"29311969-a272-4017-b1a6-be36bf991edd\" (UID: \"29311969-a272-4017-b1a6-be36bf991edd\") " Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.041512 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/29311969-a272-4017-b1a6-be36bf991edd-kube-api-access-w9gx9" (OuterVolumeSpecName: "kube-api-access-w9gx9") pod "29311969-a272-4017-b1a6-be36bf991edd" (UID: "29311969-a272-4017-b1a6-be36bf991edd"). InnerVolumeSpecName "kube-api-access-w9gx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.071910 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29311969-a272-4017-b1a6-be36bf991edd" (UID: "29311969-a272-4017-b1a6-be36bf991edd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.084897 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-config" (OuterVolumeSpecName: "config") pod "29311969-a272-4017-b1a6-be36bf991edd" (UID: "29311969-a272-4017-b1a6-be36bf991edd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.136824 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.136942 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9gx9\" (UniqueName: \"kubernetes.io/projected/29311969-a272-4017-b1a6-be36bf991edd-kube-api-access-w9gx9\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.136955 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29311969-a272-4017-b1a6-be36bf991edd-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.206070 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 09 04:35:01 crc kubenswrapper[4766]: W1209 04:35:01.243944 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdcb35ea_4aea_4b76_a13e_6fd4b9398991.slice/crio-11945bf7d55bbb63eafb96423187129cf76f1fdf478c7dded9e1692209783cf8 WatchSource:0}: Error finding container 11945bf7d55bbb63eafb96423187129cf76f1fdf478c7dded9e1692209783cf8: Status 404 returned error can't find the container with id 11945bf7d55bbb63eafb96423187129cf76f1fdf478c7dded9e1692209783cf8 Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.400091 4766 generic.go:334] "Generic (PLEG): container finished" podID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerID="ada3c6a5ee365dfde94cb090ab54ff0447051030b37b0a2c0dd0247a4266b8ba" exitCode=0 Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.400151 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprlb" 
event={"ID":"1615f7d0-334f-4e62-88de-0e7af2962ac4","Type":"ContainerDied","Data":"ada3c6a5ee365dfde94cb090ab54ff0447051030b37b0a2c0dd0247a4266b8ba"} Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.403673 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerID="1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43" exitCode=0 Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.403744 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nshvg" event={"ID":"1ceac5f1-9780-4367-be2c-d71b951a9be9","Type":"ContainerDied","Data":"1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43"} Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.403771 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nshvg" event={"ID":"1ceac5f1-9780-4367-be2c-d71b951a9be9","Type":"ContainerStarted","Data":"6fb4b913b9a00fc921a5d7533c91300bdaa28d4502664c78bf1fc355fc7b32f4"} Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.409024 4766 generic.go:334] "Generic (PLEG): container finished" podID="29311969-a272-4017-b1a6-be36bf991edd" containerID="80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416" exitCode=0 Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.409102 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.409124 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" event={"ID":"29311969-a272-4017-b1a6-be36bf991edd","Type":"ContainerDied","Data":"80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416"} Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.409167 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-kfgb8" event={"ID":"29311969-a272-4017-b1a6-be36bf991edd","Type":"ContainerDied","Data":"4d97767b4b3d0a9851902eb1b80dee93f4a3dd82287aabd9be478c34ce7e3cfb"} Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.409198 4766 scope.go:117] "RemoveContainer" containerID="80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.411617 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bdcb35ea-4aea-4b76-a13e-6fd4b9398991","Type":"ContainerStarted","Data":"11945bf7d55bbb63eafb96423187129cf76f1fdf478c7dded9e1692209783cf8"} Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.412797 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88","Type":"ContainerStarted","Data":"982a367f6f8d399ad70b65be41025097746b27f18843b9080de013b313ee5e67"} Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.440958 4766 scope.go:117] "RemoveContainer" containerID="ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.459946 4766 scope.go:117] "RemoveContainer" containerID="80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.461086 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-98ddfc8f-kfgb8"] Dec 09 04:35:01 crc kubenswrapper[4766]: E1209 04:35:01.461128 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416\": container with ID starting with 80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416 not found: ID does not exist" containerID="80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.461157 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416"} err="failed to get container status \"80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416\": rpc error: code = NotFound desc = could not find container \"80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416\": container with ID starting with 80dcd81672bf2e3bc5316143c18f3a07dcf83ec14383407a19c31eed11641416 not found: ID does not exist" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.461183 4766 scope.go:117] "RemoveContainer" containerID="ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88" Dec 09 04:35:01 crc kubenswrapper[4766]: E1209 04:35:01.461548 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88\": container with ID starting with ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88 not found: ID does not exist" containerID="ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.461578 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88"} err="failed to get 
container status \"ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88\": rpc error: code = NotFound desc = could not find container \"ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88\": container with ID starting with ade155855c0a9d86f2a2f99042188d445833b41f1a45626cfd42c00203740d88 not found: ID does not exist" Dec 09 04:35:01 crc kubenswrapper[4766]: I1209 04:35:01.467884 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-kfgb8"] Dec 09 04:35:02 crc kubenswrapper[4766]: I1209 04:35:02.422577 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bdcb35ea-4aea-4b76-a13e-6fd4b9398991","Type":"ContainerStarted","Data":"27168278e98039ff2f749a5eff800db5459660766f03f1110d7259c8c8f7808a"} Dec 09 04:35:02 crc kubenswrapper[4766]: I1209 04:35:02.424121 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88","Type":"ContainerStarted","Data":"1b1fa814275f73688121c9e69b6da9bbe033d47a7324e9fd1a85389131d2e956"} Dec 09 04:35:02 crc kubenswrapper[4766]: I1209 04:35:02.839939 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:35:02 crc kubenswrapper[4766]: E1209 04:35:02.840185 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:35:02 crc kubenswrapper[4766]: I1209 04:35:02.859289 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29311969-a272-4017-b1a6-be36bf991edd" 
path="/var/lib/kubelet/pods/29311969-a272-4017-b1a6-be36bf991edd/volumes" Dec 09 04:35:04 crc kubenswrapper[4766]: I1209 04:35:04.441614 4766 generic.go:334] "Generic (PLEG): container finished" podID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerID="d088ed33915e4b90b4cdd4274a741ec6a2177c2bf71e175898e8bf38d4261de7" exitCode=0 Dec 09 04:35:04 crc kubenswrapper[4766]: I1209 04:35:04.441681 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprlb" event={"ID":"1615f7d0-334f-4e62-88de-0e7af2962ac4","Type":"ContainerDied","Data":"d088ed33915e4b90b4cdd4274a741ec6a2177c2bf71e175898e8bf38d4261de7"} Dec 09 04:35:04 crc kubenswrapper[4766]: I1209 04:35:04.445022 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerID="d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303" exitCode=0 Dec 09 04:35:04 crc kubenswrapper[4766]: I1209 04:35:04.446083 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nshvg" event={"ID":"1ceac5f1-9780-4367-be2c-d71b951a9be9","Type":"ContainerDied","Data":"d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303"} Dec 09 04:35:04 crc kubenswrapper[4766]: I1209 04:35:04.451603 4766 generic.go:334] "Generic (PLEG): container finished" podID="327bdf4b-36b9-44f2-8e3e-38fa0fccebf7" containerID="fac7bcfa309b764f5cfb9d51ec3716ffc6301cc4952d3cd338104d25ae106e41" exitCode=0 Dec 09 04:35:04 crc kubenswrapper[4766]: I1209 04:35:04.451658 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8hff" event={"ID":"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7","Type":"ContainerDied","Data":"fac7bcfa309b764f5cfb9d51ec3716ffc6301cc4952d3cd338104d25ae106e41"} Dec 09 04:35:05 crc kubenswrapper[4766]: I1209 04:35:05.461732 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprlb" 
event={"ID":"1615f7d0-334f-4e62-88de-0e7af2962ac4","Type":"ContainerStarted","Data":"e7cb6150b4da1f6da7973f6f3db9daef39f14dd35c6816e3f863a7e6ecdc6615"} Dec 09 04:35:05 crc kubenswrapper[4766]: I1209 04:35:05.463776 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nshvg" event={"ID":"1ceac5f1-9780-4367-be2c-d71b951a9be9","Type":"ContainerStarted","Data":"9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c"} Dec 09 04:35:05 crc kubenswrapper[4766]: I1209 04:35:05.465738 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8hff" event={"ID":"327bdf4b-36b9-44f2-8e3e-38fa0fccebf7","Type":"ContainerStarted","Data":"316d146054777bc2a319b762b33edc5d74bb34696766ea1a7ea9ce8cb9b934ca"} Dec 09 04:35:05 crc kubenswrapper[4766]: I1209 04:35:05.488903 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jprlb" podStartSLOduration=3.024917348 podStartE2EDuration="6.488882753s" podCreationTimestamp="2025-12-09 04:34:59 +0000 UTC" firstStartedPulling="2025-12-09 04:35:01.401281695 +0000 UTC m=+4983.110587121" lastFinishedPulling="2025-12-09 04:35:04.86524706 +0000 UTC m=+4986.574552526" observedRunningTime="2025-12-09 04:35:05.483730633 +0000 UTC m=+4987.193036059" watchObservedRunningTime="2025-12-09 04:35:05.488882753 +0000 UTC m=+4987.198188179" Dec 09 04:35:05 crc kubenswrapper[4766]: I1209 04:35:05.510409 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nshvg" podStartSLOduration=2.067050019 podStartE2EDuration="5.510389455s" podCreationTimestamp="2025-12-09 04:35:00 +0000 UTC" firstStartedPulling="2025-12-09 04:35:01.405417307 +0000 UTC m=+4983.114722733" lastFinishedPulling="2025-12-09 04:35:04.848756723 +0000 UTC m=+4986.558062169" observedRunningTime="2025-12-09 04:35:05.507822386 +0000 UTC m=+4987.217127822" 
watchObservedRunningTime="2025-12-09 04:35:05.510389455 +0000 UTC m=+4987.219694891" Dec 09 04:35:05 crc kubenswrapper[4766]: I1209 04:35:05.540982 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g8hff" podStartSLOduration=1.938260166 podStartE2EDuration="8.540956692s" podCreationTimestamp="2025-12-09 04:34:57 +0000 UTC" firstStartedPulling="2025-12-09 04:34:58.329329192 +0000 UTC m=+4980.038634658" lastFinishedPulling="2025-12-09 04:35:04.932025728 +0000 UTC m=+4986.641331184" observedRunningTime="2025-12-09 04:35:05.534648261 +0000 UTC m=+4987.243953697" watchObservedRunningTime="2025-12-09 04:35:05.540956692 +0000 UTC m=+4987.250262118" Dec 09 04:35:07 crc kubenswrapper[4766]: I1209 04:35:07.554427 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:35:07 crc kubenswrapper[4766]: I1209 04:35:07.554494 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:35:07 crc kubenswrapper[4766]: I1209 04:35:07.606630 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:35:09 crc kubenswrapper[4766]: I1209 04:35:09.991507 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:35:09 crc kubenswrapper[4766]: I1209 04:35:09.991736 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:35:10 crc kubenswrapper[4766]: I1209 04:35:10.077156 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:35:10 crc kubenswrapper[4766]: I1209 04:35:10.506884 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:10 crc kubenswrapper[4766]: I1209 04:35:10.506964 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:10 crc kubenswrapper[4766]: I1209 04:35:10.580291 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:10 crc kubenswrapper[4766]: I1209 04:35:10.594242 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:35:11 crc kubenswrapper[4766]: I1209 04:35:11.582270 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:12 crc kubenswrapper[4766]: I1209 04:35:12.345191 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jprlb"] Dec 09 04:35:12 crc kubenswrapper[4766]: I1209 04:35:12.530782 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jprlb" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerName="registry-server" containerID="cri-o://e7cb6150b4da1f6da7973f6f3db9daef39f14dd35c6816e3f863a7e6ecdc6615" gracePeriod=2 Dec 09 04:35:12 crc kubenswrapper[4766]: I1209 04:35:12.948484 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nshvg"] Dec 09 04:35:13 crc kubenswrapper[4766]: I1209 04:35:13.538000 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nshvg" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerName="registry-server" containerID="cri-o://9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c" gracePeriod=2 Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.447485 4766 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.528853 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj7jj\" (UniqueName: \"kubernetes.io/projected/1ceac5f1-9780-4367-be2c-d71b951a9be9-kube-api-access-bj7jj\") pod \"1ceac5f1-9780-4367-be2c-d71b951a9be9\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.528911 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-catalog-content\") pod \"1ceac5f1-9780-4367-be2c-d71b951a9be9\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.528969 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-utilities\") pod \"1ceac5f1-9780-4367-be2c-d71b951a9be9\" (UID: \"1ceac5f1-9780-4367-be2c-d71b951a9be9\") " Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.530466 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-utilities" (OuterVolumeSpecName: "utilities") pod "1ceac5f1-9780-4367-be2c-d71b951a9be9" (UID: "1ceac5f1-9780-4367-be2c-d71b951a9be9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.534507 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ceac5f1-9780-4367-be2c-d71b951a9be9-kube-api-access-bj7jj" (OuterVolumeSpecName: "kube-api-access-bj7jj") pod "1ceac5f1-9780-4367-be2c-d71b951a9be9" (UID: "1ceac5f1-9780-4367-be2c-d71b951a9be9"). InnerVolumeSpecName "kube-api-access-bj7jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.549379 4766 generic.go:334] "Generic (PLEG): container finished" podID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerID="e7cb6150b4da1f6da7973f6f3db9daef39f14dd35c6816e3f863a7e6ecdc6615" exitCode=0 Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.549460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprlb" event={"ID":"1615f7d0-334f-4e62-88de-0e7af2962ac4","Type":"ContainerDied","Data":"e7cb6150b4da1f6da7973f6f3db9daef39f14dd35c6816e3f863a7e6ecdc6615"} Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.551452 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerID="9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c" exitCode=0 Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.551490 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nshvg" event={"ID":"1ceac5f1-9780-4367-be2c-d71b951a9be9","Type":"ContainerDied","Data":"9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c"} Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.551525 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nshvg" event={"ID":"1ceac5f1-9780-4367-be2c-d71b951a9be9","Type":"ContainerDied","Data":"6fb4b913b9a00fc921a5d7533c91300bdaa28d4502664c78bf1fc355fc7b32f4"} Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.551554 4766 scope.go:117] "RemoveContainer" containerID="9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.551564 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nshvg" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.569623 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ceac5f1-9780-4367-be2c-d71b951a9be9" (UID: "1ceac5f1-9780-4367-be2c-d71b951a9be9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.584713 4766 scope.go:117] "RemoveContainer" containerID="d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.611065 4766 scope.go:117] "RemoveContainer" containerID="1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.631320 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj7jj\" (UniqueName: \"kubernetes.io/projected/1ceac5f1-9780-4367-be2c-d71b951a9be9-kube-api-access-bj7jj\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.631366 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.631379 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ceac5f1-9780-4367-be2c-d71b951a9be9-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.648583 4766 scope.go:117] "RemoveContainer" containerID="9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c" Dec 09 04:35:14 crc kubenswrapper[4766]: E1209 04:35:14.649061 4766 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c\": container with ID starting with 9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c not found: ID does not exist" containerID="9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.649108 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c"} err="failed to get container status \"9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c\": rpc error: code = NotFound desc = could not find container \"9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c\": container with ID starting with 9ffef51a858f047da061511679c90e3b9ef7b1f4d6806669f5a6547d6abd1c0c not found: ID does not exist" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.649134 4766 scope.go:117] "RemoveContainer" containerID="d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303" Dec 09 04:35:14 crc kubenswrapper[4766]: E1209 04:35:14.649667 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303\": container with ID starting with d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303 not found: ID does not exist" containerID="d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.649697 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303"} err="failed to get container status \"d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303\": rpc error: code = NotFound desc = could not find container 
\"d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303\": container with ID starting with d29e6cb44006de42e719291b63d3cbc221ee5e17b0be0cf37a61fd08f62bb303 not found: ID does not exist" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.649714 4766 scope.go:117] "RemoveContainer" containerID="1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43" Dec 09 04:35:14 crc kubenswrapper[4766]: E1209 04:35:14.650099 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43\": container with ID starting with 1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43 not found: ID does not exist" containerID="1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.650179 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43"} err="failed to get container status \"1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43\": rpc error: code = NotFound desc = could not find container \"1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43\": container with ID starting with 1350cd1b0977f7b0df8413f5ec93b67065f48201437f8bb1b74cd023198caf43 not found: ID does not exist" Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.893032 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nshvg"] Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.900876 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nshvg"] Dec 09 04:35:14 crc kubenswrapper[4766]: I1209 04:35:14.959163 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.037835 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m865s\" (UniqueName: \"kubernetes.io/projected/1615f7d0-334f-4e62-88de-0e7af2962ac4-kube-api-access-m865s\") pod \"1615f7d0-334f-4e62-88de-0e7af2962ac4\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.038194 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-catalog-content\") pod \"1615f7d0-334f-4e62-88de-0e7af2962ac4\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.038374 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-utilities\") pod \"1615f7d0-334f-4e62-88de-0e7af2962ac4\" (UID: \"1615f7d0-334f-4e62-88de-0e7af2962ac4\") " Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.039926 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-utilities" (OuterVolumeSpecName: "utilities") pod "1615f7d0-334f-4e62-88de-0e7af2962ac4" (UID: "1615f7d0-334f-4e62-88de-0e7af2962ac4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.045668 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1615f7d0-334f-4e62-88de-0e7af2962ac4-kube-api-access-m865s" (OuterVolumeSpecName: "kube-api-access-m865s") pod "1615f7d0-334f-4e62-88de-0e7af2962ac4" (UID: "1615f7d0-334f-4e62-88de-0e7af2962ac4"). InnerVolumeSpecName "kube-api-access-m865s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.106478 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1615f7d0-334f-4e62-88de-0e7af2962ac4" (UID: "1615f7d0-334f-4e62-88de-0e7af2962ac4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.140070 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.140105 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m865s\" (UniqueName: \"kubernetes.io/projected/1615f7d0-334f-4e62-88de-0e7af2962ac4-kube-api-access-m865s\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.140114 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1615f7d0-334f-4e62-88de-0e7af2962ac4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.565184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jprlb" event={"ID":"1615f7d0-334f-4e62-88de-0e7af2962ac4","Type":"ContainerDied","Data":"2de763e37071f6cf31d521d1dd57060dbb5957a1956369265206c1a5b4a95c66"} Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.565248 4766 scope.go:117] "RemoveContainer" containerID="e7cb6150b4da1f6da7973f6f3db9daef39f14dd35c6816e3f863a7e6ecdc6615" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.565317 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jprlb" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.592169 4766 scope.go:117] "RemoveContainer" containerID="d088ed33915e4b90b4cdd4274a741ec6a2177c2bf71e175898e8bf38d4261de7" Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.612063 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jprlb"] Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.617572 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jprlb"] Dec 09 04:35:15 crc kubenswrapper[4766]: I1209 04:35:15.627607 4766 scope.go:117] "RemoveContainer" containerID="ada3c6a5ee365dfde94cb090ab54ff0447051030b37b0a2c0dd0247a4266b8ba" Dec 09 04:35:16 crc kubenswrapper[4766]: I1209 04:35:16.840252 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:35:16 crc kubenswrapper[4766]: E1209 04:35:16.840972 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:35:16 crc kubenswrapper[4766]: I1209 04:35:16.860776 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" path="/var/lib/kubelet/pods/1615f7d0-334f-4e62-88de-0e7af2962ac4/volumes" Dec 09 04:35:16 crc kubenswrapper[4766]: I1209 04:35:16.862323 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" path="/var/lib/kubelet/pods/1ceac5f1-9780-4367-be2c-d71b951a9be9/volumes" Dec 09 04:35:17 crc kubenswrapper[4766]: 
I1209 04:35:17.616062 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g8hff" Dec 09 04:35:20 crc kubenswrapper[4766]: I1209 04:35:20.390473 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8hff"] Dec 09 04:35:20 crc kubenswrapper[4766]: I1209 04:35:20.552065 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5kv77"] Dec 09 04:35:20 crc kubenswrapper[4766]: I1209 04:35:20.552410 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5kv77" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="registry-server" containerID="cri-o://63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f" gracePeriod=2 Dec 09 04:35:21 crc kubenswrapper[4766]: E1209 04:35:21.342475 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f is running failed: container process not found" containerID="63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 04:35:21 crc kubenswrapper[4766]: E1209 04:35:21.343009 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f is running failed: container process not found" containerID="63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 04:35:21 crc kubenswrapper[4766]: E1209 04:35:21.343425 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f is running failed: container process not found" containerID="63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 04:35:21 crc kubenswrapper[4766]: E1209 04:35:21.343504 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-5kv77" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="registry-server" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.479122 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.622510 4766 generic.go:334] "Generic (PLEG): container finished" podID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerID="63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f" exitCode=0 Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.622561 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kv77" event={"ID":"171fcfd0-caf1-4bbd-9298-c920c0c61de8","Type":"ContainerDied","Data":"63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f"} Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.622600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kv77" event={"ID":"171fcfd0-caf1-4bbd-9298-c920c0c61de8","Type":"ContainerDied","Data":"4f189451a317d6f2972ab75c14930d23ec029feaf9f32e5b3b4d2cab8da0d987"} Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.622602 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5kv77" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.622617 4766 scope.go:117] "RemoveContainer" containerID="63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.651487 4766 scope.go:117] "RemoveContainer" containerID="61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.652308 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-catalog-content\") pod \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.652406 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjfvt\" (UniqueName: \"kubernetes.io/projected/171fcfd0-caf1-4bbd-9298-c920c0c61de8-kube-api-access-xjfvt\") pod \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.652542 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-utilities\") pod \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\" (UID: \"171fcfd0-caf1-4bbd-9298-c920c0c61de8\") " Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.653389 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-utilities" (OuterVolumeSpecName: "utilities") pod "171fcfd0-caf1-4bbd-9298-c920c0c61de8" (UID: "171fcfd0-caf1-4bbd-9298-c920c0c61de8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.658310 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171fcfd0-caf1-4bbd-9298-c920c0c61de8-kube-api-access-xjfvt" (OuterVolumeSpecName: "kube-api-access-xjfvt") pod "171fcfd0-caf1-4bbd-9298-c920c0c61de8" (UID: "171fcfd0-caf1-4bbd-9298-c920c0c61de8"). InnerVolumeSpecName "kube-api-access-xjfvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.675232 4766 scope.go:117] "RemoveContainer" containerID="b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.704041 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "171fcfd0-caf1-4bbd-9298-c920c0c61de8" (UID: "171fcfd0-caf1-4bbd-9298-c920c0c61de8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.724519 4766 scope.go:117] "RemoveContainer" containerID="63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f" Dec 09 04:35:21 crc kubenswrapper[4766]: E1209 04:35:21.724894 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f\": container with ID starting with 63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f not found: ID does not exist" containerID="63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.724921 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f"} err="failed to get container status \"63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f\": rpc error: code = NotFound desc = could not find container \"63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f\": container with ID starting with 63e71256fa4934eff7f086c68120b5d3e69327565b9a00f9b3986d7c2e19b76f not found: ID does not exist" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.724943 4766 scope.go:117] "RemoveContainer" containerID="61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444" Dec 09 04:35:21 crc kubenswrapper[4766]: E1209 04:35:21.725149 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444\": container with ID starting with 61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444 not found: ID does not exist" containerID="61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.725167 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444"} err="failed to get container status \"61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444\": rpc error: code = NotFound desc = could not find container \"61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444\": container with ID starting with 61b34fe4c36476689ced975b1e4fd5f5585c00f2818922f484d96e77d2a23444 not found: ID does not exist" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.725181 4766 scope.go:117] "RemoveContainer" containerID="b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43" Dec 09 04:35:21 crc kubenswrapper[4766]: E1209 04:35:21.725401 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43\": container with ID starting with b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43 not found: ID does not exist" containerID="b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.725416 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43"} err="failed to get container status \"b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43\": rpc error: code = NotFound desc = could not find container \"b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43\": container with ID starting with b826b7a915a9f828568b0549e8960f5e5e79df3ef9c23e976c4626cf1cdd3f43 not found: ID does not exist" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.753861 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-utilities\") on node 
\"crc\" DevicePath \"\"" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.753901 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/171fcfd0-caf1-4bbd-9298-c920c0c61de8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.753911 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjfvt\" (UniqueName: \"kubernetes.io/projected/171fcfd0-caf1-4bbd-9298-c920c0c61de8-kube-api-access-xjfvt\") on node \"crc\" DevicePath \"\"" Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.950715 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5kv77"] Dec 09 04:35:21 crc kubenswrapper[4766]: I1209 04:35:21.957194 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5kv77"] Dec 09 04:35:22 crc kubenswrapper[4766]: I1209 04:35:22.851916 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" path="/var/lib/kubelet/pods/171fcfd0-caf1-4bbd-9298-c920c0c61de8/volumes" Dec 09 04:35:30 crc kubenswrapper[4766]: I1209 04:35:30.839740 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:35:30 crc kubenswrapper[4766]: E1209 04:35:30.840473 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:35:34 crc kubenswrapper[4766]: I1209 04:35:34.752523 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="bdcb35ea-4aea-4b76-a13e-6fd4b9398991" containerID="27168278e98039ff2f749a5eff800db5459660766f03f1110d7259c8c8f7808a" exitCode=0 Dec 09 04:35:34 crc kubenswrapper[4766]: I1209 04:35:34.752703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bdcb35ea-4aea-4b76-a13e-6fd4b9398991","Type":"ContainerDied","Data":"27168278e98039ff2f749a5eff800db5459660766f03f1110d7259c8c8f7808a"} Dec 09 04:35:34 crc kubenswrapper[4766]: I1209 04:35:34.760999 4766 generic.go:334] "Generic (PLEG): container finished" podID="74a6e9a7-3db2-49eb-bbaf-40536dcc8d88" containerID="1b1fa814275f73688121c9e69b6da9bbe033d47a7324e9fd1a85389131d2e956" exitCode=0 Dec 09 04:35:34 crc kubenswrapper[4766]: I1209 04:35:34.761043 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88","Type":"ContainerDied","Data":"1b1fa814275f73688121c9e69b6da9bbe033d47a7324e9fd1a85389131d2e956"} Dec 09 04:35:35 crc kubenswrapper[4766]: I1209 04:35:35.777890 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74a6e9a7-3db2-49eb-bbaf-40536dcc8d88","Type":"ContainerStarted","Data":"826e33c2b460854331b493366826a88883a2b6c43c752157c7c672a55fd09211"} Dec 09 04:35:35 crc kubenswrapper[4766]: I1209 04:35:35.778442 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 09 04:35:35 crc kubenswrapper[4766]: I1209 04:35:35.783730 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bdcb35ea-4aea-4b76-a13e-6fd4b9398991","Type":"ContainerStarted","Data":"651c14b50b4ceef49f0e15624e88bf169249ba27b7f8aa1a8c45d68f735a06b2"} Dec 09 04:35:35 crc kubenswrapper[4766]: I1209 04:35:35.784827 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:35 crc kubenswrapper[4766]: I1209 
04:35:35.814637 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.814617514 podStartE2EDuration="36.814617514s" podCreationTimestamp="2025-12-09 04:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:35:35.808259322 +0000 UTC m=+5017.517564748" watchObservedRunningTime="2025-12-09 04:35:35.814617514 +0000 UTC m=+5017.523922950" Dec 09 04:35:35 crc kubenswrapper[4766]: I1209 04:35:35.839295 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.839262921 podStartE2EDuration="35.839262921s" podCreationTimestamp="2025-12-09 04:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:35:35.829536777 +0000 UTC m=+5017.538842223" watchObservedRunningTime="2025-12-09 04:35:35.839262921 +0000 UTC m=+5017.548568387" Dec 09 04:35:43 crc kubenswrapper[4766]: I1209 04:35:43.839720 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:35:43 crc kubenswrapper[4766]: E1209 04:35:43.841053 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:35:50 crc kubenswrapper[4766]: I1209 04:35:50.048457 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 09 04:35:50 crc kubenswrapper[4766]: I1209 04:35:50.778461 4766 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 09 04:35:56 crc kubenswrapper[4766]: I1209 04:35:56.839269 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:35:56 crc kubenswrapper[4766]: E1209 04:35:56.841035 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.709691 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710352 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerName="extract-content" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710371 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerName="extract-content" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710384 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerName="extract-utilities" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710393 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerName="extract-utilities" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710410 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerName="extract-utilities" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 
04:35:58.710419 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerName="extract-utilities" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710433 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29311969-a272-4017-b1a6-be36bf991edd" containerName="dnsmasq-dns" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710440 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="29311969-a272-4017-b1a6-be36bf991edd" containerName="dnsmasq-dns" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710450 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710457 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710481 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710489 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710502 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710509 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710522 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29311969-a272-4017-b1a6-be36bf991edd" containerName="init" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710529 4766 
state_mem.go:107] "Deleted CPUSet assignment" podUID="29311969-a272-4017-b1a6-be36bf991edd" containerName="init" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710544 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="extract-utilities" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710552 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="extract-utilities" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710568 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerName="extract-content" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710608 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerName="extract-content" Dec 09 04:35:58 crc kubenswrapper[4766]: E1209 04:35:58.710622 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="extract-content" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710630 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="extract-content" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710797 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="171fcfd0-caf1-4bbd-9298-c920c0c61de8" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710814 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="29311969-a272-4017-b1a6-be36bf991edd" containerName="dnsmasq-dns" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710832 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ceac5f1-9780-4367-be2c-d71b951a9be9" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.710844 4766 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1615f7d0-334f-4e62-88de-0e7af2962ac4" containerName="registry-server" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.711466 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.718559 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.752136 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w94mz" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.796620 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb7dp\" (UniqueName: \"kubernetes.io/projected/a545d97a-0d76-46fb-bb77-2cc422a5bc4f-kube-api-access-bb7dp\") pod \"mariadb-client-1-default\" (UID: \"a545d97a-0d76-46fb-bb77-2cc422a5bc4f\") " pod="openstack/mariadb-client-1-default" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.897880 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb7dp\" (UniqueName: \"kubernetes.io/projected/a545d97a-0d76-46fb-bb77-2cc422a5bc4f-kube-api-access-bb7dp\") pod \"mariadb-client-1-default\" (UID: \"a545d97a-0d76-46fb-bb77-2cc422a5bc4f\") " pod="openstack/mariadb-client-1-default" Dec 09 04:35:58 crc kubenswrapper[4766]: I1209 04:35:58.927009 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb7dp\" (UniqueName: \"kubernetes.io/projected/a545d97a-0d76-46fb-bb77-2cc422a5bc4f-kube-api-access-bb7dp\") pod \"mariadb-client-1-default\" (UID: \"a545d97a-0d76-46fb-bb77-2cc422a5bc4f\") " pod="openstack/mariadb-client-1-default" Dec 09 04:35:59 crc kubenswrapper[4766]: I1209 04:35:59.072994 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"default-dockercfg-w94mz" Dec 09 04:35:59 crc kubenswrapper[4766]: I1209 04:35:59.079776 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 09 04:35:59 crc kubenswrapper[4766]: I1209 04:35:59.404146 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 04:35:59 crc kubenswrapper[4766]: I1209 04:35:59.980853 4766 generic.go:334] "Generic (PLEG): container finished" podID="a545d97a-0d76-46fb-bb77-2cc422a5bc4f" containerID="ba9fc45a426a1f3d0d8051b462d4187d6027d6ca366c47d41974ed4daac3040a" exitCode=0 Dec 09 04:35:59 crc kubenswrapper[4766]: I1209 04:35:59.980975 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"a545d97a-0d76-46fb-bb77-2cc422a5bc4f","Type":"ContainerDied","Data":"ba9fc45a426a1f3d0d8051b462d4187d6027d6ca366c47d41974ed4daac3040a"} Dec 09 04:35:59 crc kubenswrapper[4766]: I1209 04:35:59.981306 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"a545d97a-0d76-46fb-bb77-2cc422a5bc4f","Type":"ContainerStarted","Data":"413501510b6e271795d6a568acbcd6d47547c8e93f9aa72f4855b3b5eae5be8d"} Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.375872 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.402648 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_a545d97a-0d76-46fb-bb77-2cc422a5bc4f/mariadb-client-1-default/0.log" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.430072 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.435654 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.538661 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb7dp\" (UniqueName: \"kubernetes.io/projected/a545d97a-0d76-46fb-bb77-2cc422a5bc4f-kube-api-access-bb7dp\") pod \"a545d97a-0d76-46fb-bb77-2cc422a5bc4f\" (UID: \"a545d97a-0d76-46fb-bb77-2cc422a5bc4f\") " Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.544293 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a545d97a-0d76-46fb-bb77-2cc422a5bc4f-kube-api-access-bb7dp" (OuterVolumeSpecName: "kube-api-access-bb7dp") pod "a545d97a-0d76-46fb-bb77-2cc422a5bc4f" (UID: "a545d97a-0d76-46fb-bb77-2cc422a5bc4f"). InnerVolumeSpecName "kube-api-access-bb7dp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.641003 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb7dp\" (UniqueName: \"kubernetes.io/projected/a545d97a-0d76-46fb-bb77-2cc422a5bc4f-kube-api-access-bb7dp\") on node \"crc\" DevicePath \"\"" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.917429 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 04:36:01 crc kubenswrapper[4766]: E1209 04:36:01.918291 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a545d97a-0d76-46fb-bb77-2cc422a5bc4f" containerName="mariadb-client-1-default" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.918331 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a545d97a-0d76-46fb-bb77-2cc422a5bc4f" containerName="mariadb-client-1-default" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.918653 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a545d97a-0d76-46fb-bb77-2cc422a5bc4f" containerName="mariadb-client-1-default" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.919424 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.927467 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.996077 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="413501510b6e271795d6a568acbcd6d47547c8e93f9aa72f4855b3b5eae5be8d" Dec 09 04:36:01 crc kubenswrapper[4766]: I1209 04:36:01.996122 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 09 04:36:02 crc kubenswrapper[4766]: I1209 04:36:02.047537 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/04906dbb-ad58-4029-9c4f-0c79377d14a8-kube-api-access-x47f8\") pod \"mariadb-client-2-default\" (UID: \"04906dbb-ad58-4029-9c4f-0c79377d14a8\") " pod="openstack/mariadb-client-2-default" Dec 09 04:36:02 crc kubenswrapper[4766]: I1209 04:36:02.149499 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/04906dbb-ad58-4029-9c4f-0c79377d14a8-kube-api-access-x47f8\") pod \"mariadb-client-2-default\" (UID: \"04906dbb-ad58-4029-9c4f-0c79377d14a8\") " pod="openstack/mariadb-client-2-default" Dec 09 04:36:02 crc kubenswrapper[4766]: I1209 04:36:02.187967 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/04906dbb-ad58-4029-9c4f-0c79377d14a8-kube-api-access-x47f8\") pod \"mariadb-client-2-default\" (UID: \"04906dbb-ad58-4029-9c4f-0c79377d14a8\") " pod="openstack/mariadb-client-2-default" Dec 09 04:36:02 crc kubenswrapper[4766]: I1209 04:36:02.246930 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 09 04:36:02 crc kubenswrapper[4766]: I1209 04:36:02.849628 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a545d97a-0d76-46fb-bb77-2cc422a5bc4f" path="/var/lib/kubelet/pods/a545d97a-0d76-46fb-bb77-2cc422a5bc4f/volumes" Dec 09 04:36:02 crc kubenswrapper[4766]: I1209 04:36:02.885855 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 04:36:03 crc kubenswrapper[4766]: I1209 04:36:03.003587 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"04906dbb-ad58-4029-9c4f-0c79377d14a8","Type":"ContainerStarted","Data":"ab3af91717b1d1586b9ccb5e6bbdd2c7b8e37a665a32358cb1e791beb6f021a7"} Dec 09 04:36:04 crc kubenswrapper[4766]: I1209 04:36:04.015054 4766 generic.go:334] "Generic (PLEG): container finished" podID="04906dbb-ad58-4029-9c4f-0c79377d14a8" containerID="71dcca40cb34a1b59bf74581df36020017cb2f16db3ba2a071dbd92f9b3650a1" exitCode=1 Dec 09 04:36:04 crc kubenswrapper[4766]: I1209 04:36:04.015126 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"04906dbb-ad58-4029-9c4f-0c79377d14a8","Type":"ContainerDied","Data":"71dcca40cb34a1b59bf74581df36020017cb2f16db3ba2a071dbd92f9b3650a1"} Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.381587 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.405634 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_04906dbb-ad58-4029-9c4f-0c79377d14a8/mariadb-client-2-default/0.log" Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.442276 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.447517 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.504445 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/04906dbb-ad58-4029-9c4f-0c79377d14a8-kube-api-access-x47f8\") pod \"04906dbb-ad58-4029-9c4f-0c79377d14a8\" (UID: \"04906dbb-ad58-4029-9c4f-0c79377d14a8\") " Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.512866 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04906dbb-ad58-4029-9c4f-0c79377d14a8-kube-api-access-x47f8" (OuterVolumeSpecName: "kube-api-access-x47f8") pod "04906dbb-ad58-4029-9c4f-0c79377d14a8" (UID: "04906dbb-ad58-4029-9c4f-0c79377d14a8"). InnerVolumeSpecName "kube-api-access-x47f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.606375 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x47f8\" (UniqueName: \"kubernetes.io/projected/04906dbb-ad58-4029-9c4f-0c79377d14a8-kube-api-access-x47f8\") on node \"crc\" DevicePath \"\"" Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.917600 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 09 04:36:05 crc kubenswrapper[4766]: E1209 04:36:05.917904 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04906dbb-ad58-4029-9c4f-0c79377d14a8" containerName="mariadb-client-2-default" Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.917917 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="04906dbb-ad58-4029-9c4f-0c79377d14a8" containerName="mariadb-client-2-default" Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.918071 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="04906dbb-ad58-4029-9c4f-0c79377d14a8" containerName="mariadb-client-2-default" Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.918686 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 09 04:36:05 crc kubenswrapper[4766]: I1209 04:36:05.930321 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 09 04:36:06 crc kubenswrapper[4766]: I1209 04:36:06.013809 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9gzx\" (UniqueName: \"kubernetes.io/projected/9322e21c-9af2-48c7-b04d-76db3fce0628-kube-api-access-b9gzx\") pod \"mariadb-client-1\" (UID: \"9322e21c-9af2-48c7-b04d-76db3fce0628\") " pod="openstack/mariadb-client-1" Dec 09 04:36:06 crc kubenswrapper[4766]: I1209 04:36:06.032817 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab3af91717b1d1586b9ccb5e6bbdd2c7b8e37a665a32358cb1e791beb6f021a7" Dec 09 04:36:06 crc kubenswrapper[4766]: I1209 04:36:06.032984 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 09 04:36:06 crc kubenswrapper[4766]: I1209 04:36:06.115610 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9gzx\" (UniqueName: \"kubernetes.io/projected/9322e21c-9af2-48c7-b04d-76db3fce0628-kube-api-access-b9gzx\") pod \"mariadb-client-1\" (UID: \"9322e21c-9af2-48c7-b04d-76db3fce0628\") " pod="openstack/mariadb-client-1" Dec 09 04:36:06 crc kubenswrapper[4766]: I1209 04:36:06.134875 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9gzx\" (UniqueName: \"kubernetes.io/projected/9322e21c-9af2-48c7-b04d-76db3fce0628-kube-api-access-b9gzx\") pod \"mariadb-client-1\" (UID: \"9322e21c-9af2-48c7-b04d-76db3fce0628\") " pod="openstack/mariadb-client-1" Dec 09 04:36:06 crc kubenswrapper[4766]: I1209 04:36:06.241545 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 09 04:36:06 crc kubenswrapper[4766]: I1209 04:36:06.742670 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 09 04:36:06 crc kubenswrapper[4766]: W1209 04:36:06.747998 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9322e21c_9af2_48c7_b04d_76db3fce0628.slice/crio-392f36b60967cee4585c6ebb5517462efb90c47d2839ebced4027cb6fc1633e8 WatchSource:0}: Error finding container 392f36b60967cee4585c6ebb5517462efb90c47d2839ebced4027cb6fc1633e8: Status 404 returned error can't find the container with id 392f36b60967cee4585c6ebb5517462efb90c47d2839ebced4027cb6fc1633e8 Dec 09 04:36:06 crc kubenswrapper[4766]: I1209 04:36:06.848736 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04906dbb-ad58-4029-9c4f-0c79377d14a8" path="/var/lib/kubelet/pods/04906dbb-ad58-4029-9c4f-0c79377d14a8/volumes" Dec 09 04:36:07 crc kubenswrapper[4766]: I1209 04:36:07.041816 4766 generic.go:334] "Generic (PLEG): container finished" podID="9322e21c-9af2-48c7-b04d-76db3fce0628" containerID="8623c237f97c8ced28d3e901689c0a067959684b23a85768598dfd7ec56dc090" exitCode=0 Dec 09 04:36:07 crc kubenswrapper[4766]: I1209 04:36:07.041873 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"9322e21c-9af2-48c7-b04d-76db3fce0628","Type":"ContainerDied","Data":"8623c237f97c8ced28d3e901689c0a067959684b23a85768598dfd7ec56dc090"} Dec 09 04:36:07 crc kubenswrapper[4766]: I1209 04:36:07.041912 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"9322e21c-9af2-48c7-b04d-76db3fce0628","Type":"ContainerStarted","Data":"392f36b60967cee4585c6ebb5517462efb90c47d2839ebced4027cb6fc1633e8"} Dec 09 04:36:08 crc kubenswrapper[4766]: I1209 04:36:08.427166 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 09 04:36:08 crc kubenswrapper[4766]: I1209 04:36:08.452527 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_9322e21c-9af2-48c7-b04d-76db3fce0628/mariadb-client-1/0.log" Dec 09 04:36:08 crc kubenswrapper[4766]: I1209 04:36:08.481869 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 09 04:36:08 crc kubenswrapper[4766]: I1209 04:36:08.488458 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 09 04:36:08 crc kubenswrapper[4766]: I1209 04:36:08.549024 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9gzx\" (UniqueName: \"kubernetes.io/projected/9322e21c-9af2-48c7-b04d-76db3fce0628-kube-api-access-b9gzx\") pod \"9322e21c-9af2-48c7-b04d-76db3fce0628\" (UID: \"9322e21c-9af2-48c7-b04d-76db3fce0628\") " Dec 09 04:36:08 crc kubenswrapper[4766]: I1209 04:36:08.560665 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9322e21c-9af2-48c7-b04d-76db3fce0628-kube-api-access-b9gzx" (OuterVolumeSpecName: "kube-api-access-b9gzx") pod "9322e21c-9af2-48c7-b04d-76db3fce0628" (UID: "9322e21c-9af2-48c7-b04d-76db3fce0628"). InnerVolumeSpecName "kube-api-access-b9gzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:36:08 crc kubenswrapper[4766]: I1209 04:36:08.650762 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9gzx\" (UniqueName: \"kubernetes.io/projected/9322e21c-9af2-48c7-b04d-76db3fce0628-kube-api-access-b9gzx\") on node \"crc\" DevicePath \"\"" Dec 09 04:36:08 crc kubenswrapper[4766]: I1209 04:36:08.863525 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9322e21c-9af2-48c7-b04d-76db3fce0628" path="/var/lib/kubelet/pods/9322e21c-9af2-48c7-b04d-76db3fce0628/volumes" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.003688 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 04:36:09 crc kubenswrapper[4766]: E1209 04:36:09.004000 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9322e21c-9af2-48c7-b04d-76db3fce0628" containerName="mariadb-client-1" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.004017 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9322e21c-9af2-48c7-b04d-76db3fce0628" containerName="mariadb-client-1" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.004151 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9322e21c-9af2-48c7-b04d-76db3fce0628" containerName="mariadb-client-1" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.004752 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.029420 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.063928 4766 scope.go:117] "RemoveContainer" containerID="8623c237f97c8ced28d3e901689c0a067959684b23a85768598dfd7ec56dc090" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.063972 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.169049 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59xbd\" (UniqueName: \"kubernetes.io/projected/21dc43cd-3cc2-4c82-b6af-b29e1db5a979-kube-api-access-59xbd\") pod \"mariadb-client-4-default\" (UID: \"21dc43cd-3cc2-4c82-b6af-b29e1db5a979\") " pod="openstack/mariadb-client-4-default" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.270318 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59xbd\" (UniqueName: \"kubernetes.io/projected/21dc43cd-3cc2-4c82-b6af-b29e1db5a979-kube-api-access-59xbd\") pod \"mariadb-client-4-default\" (UID: \"21dc43cd-3cc2-4c82-b6af-b29e1db5a979\") " pod="openstack/mariadb-client-4-default" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.294261 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59xbd\" (UniqueName: \"kubernetes.io/projected/21dc43cd-3cc2-4c82-b6af-b29e1db5a979-kube-api-access-59xbd\") pod \"mariadb-client-4-default\" (UID: \"21dc43cd-3cc2-4c82-b6af-b29e1db5a979\") " pod="openstack/mariadb-client-4-default" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.329801 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 09 04:36:09 crc kubenswrapper[4766]: I1209 04:36:09.840527 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 04:36:09 crc kubenswrapper[4766]: W1209 04:36:09.846328 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21dc43cd_3cc2_4c82_b6af_b29e1db5a979.slice/crio-246969967b1b5a5108cd90fae1b38785f397bce824a9ffd9f2af0ae7d94fa7a6 WatchSource:0}: Error finding container 246969967b1b5a5108cd90fae1b38785f397bce824a9ffd9f2af0ae7d94fa7a6: Status 404 returned error can't find the container with id 246969967b1b5a5108cd90fae1b38785f397bce824a9ffd9f2af0ae7d94fa7a6 Dec 09 04:36:10 crc kubenswrapper[4766]: I1209 04:36:10.072740 4766 generic.go:334] "Generic (PLEG): container finished" podID="21dc43cd-3cc2-4c82-b6af-b29e1db5a979" containerID="6f4237376e57135236fdb5193512b4d01f15a4ad09623d20ad3b43ca377c30ef" exitCode=0 Dec 09 04:36:10 crc kubenswrapper[4766]: I1209 04:36:10.072846 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"21dc43cd-3cc2-4c82-b6af-b29e1db5a979","Type":"ContainerDied","Data":"6f4237376e57135236fdb5193512b4d01f15a4ad09623d20ad3b43ca377c30ef"} Dec 09 04:36:10 crc kubenswrapper[4766]: I1209 04:36:10.072901 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"21dc43cd-3cc2-4c82-b6af-b29e1db5a979","Type":"ContainerStarted","Data":"246969967b1b5a5108cd90fae1b38785f397bce824a9ffd9f2af0ae7d94fa7a6"} Dec 09 04:36:10 crc kubenswrapper[4766]: I1209 04:36:10.840728 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:36:10 crc kubenswrapper[4766]: E1209 04:36:10.841908 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:36:11 crc kubenswrapper[4766]: I1209 04:36:11.418428 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 09 04:36:11 crc kubenswrapper[4766]: I1209 04:36:11.436195 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_21dc43cd-3cc2-4c82-b6af-b29e1db5a979/mariadb-client-4-default/0.log" Dec 09 04:36:11 crc kubenswrapper[4766]: I1209 04:36:11.469858 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 04:36:11 crc kubenswrapper[4766]: I1209 04:36:11.477961 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 09 04:36:11 crc kubenswrapper[4766]: I1209 04:36:11.605746 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59xbd\" (UniqueName: \"kubernetes.io/projected/21dc43cd-3cc2-4c82-b6af-b29e1db5a979-kube-api-access-59xbd\") pod \"21dc43cd-3cc2-4c82-b6af-b29e1db5a979\" (UID: \"21dc43cd-3cc2-4c82-b6af-b29e1db5a979\") " Dec 09 04:36:11 crc kubenswrapper[4766]: I1209 04:36:11.614568 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21dc43cd-3cc2-4c82-b6af-b29e1db5a979-kube-api-access-59xbd" (OuterVolumeSpecName: "kube-api-access-59xbd") pod "21dc43cd-3cc2-4c82-b6af-b29e1db5a979" (UID: "21dc43cd-3cc2-4c82-b6af-b29e1db5a979"). InnerVolumeSpecName "kube-api-access-59xbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:36:11 crc kubenswrapper[4766]: I1209 04:36:11.707644 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59xbd\" (UniqueName: \"kubernetes.io/projected/21dc43cd-3cc2-4c82-b6af-b29e1db5a979-kube-api-access-59xbd\") on node \"crc\" DevicePath \"\"" Dec 09 04:36:12 crc kubenswrapper[4766]: I1209 04:36:12.090275 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="246969967b1b5a5108cd90fae1b38785f397bce824a9ffd9f2af0ae7d94fa7a6" Dec 09 04:36:12 crc kubenswrapper[4766]: I1209 04:36:12.090310 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 09 04:36:12 crc kubenswrapper[4766]: I1209 04:36:12.853590 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21dc43cd-3cc2-4c82-b6af-b29e1db5a979" path="/var/lib/kubelet/pods/21dc43cd-3cc2-4c82-b6af-b29e1db5a979/volumes" Dec 09 04:36:13 crc kubenswrapper[4766]: I1209 04:36:13.545152 4766 scope.go:117] "RemoveContainer" containerID="1f4207a37955764890b3f2af640a5090f8ea902f8d4667ed6c87aa2742155685" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.080297 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 04:36:16 crc kubenswrapper[4766]: E1209 04:36:16.081148 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dc43cd-3cc2-4c82-b6af-b29e1db5a979" containerName="mariadb-client-4-default" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.081168 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dc43cd-3cc2-4c82-b6af-b29e1db5a979" containerName="mariadb-client-4-default" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.081391 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="21dc43cd-3cc2-4c82-b6af-b29e1db5a979" containerName="mariadb-client-4-default" Dec 09 04:36:16 crc kubenswrapper[4766]: 
I1209 04:36:16.082007 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.084903 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w94mz" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.091149 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.212396 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dfqj\" (UniqueName: \"kubernetes.io/projected/62848099-e470-4866-a0a9-7f0ec2c9e48e-kube-api-access-6dfqj\") pod \"mariadb-client-5-default\" (UID: \"62848099-e470-4866-a0a9-7f0ec2c9e48e\") " pod="openstack/mariadb-client-5-default" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.314538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dfqj\" (UniqueName: \"kubernetes.io/projected/62848099-e470-4866-a0a9-7f0ec2c9e48e-kube-api-access-6dfqj\") pod \"mariadb-client-5-default\" (UID: \"62848099-e470-4866-a0a9-7f0ec2c9e48e\") " pod="openstack/mariadb-client-5-default" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.352202 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dfqj\" (UniqueName: \"kubernetes.io/projected/62848099-e470-4866-a0a9-7f0ec2c9e48e-kube-api-access-6dfqj\") pod \"mariadb-client-5-default\" (UID: \"62848099-e470-4866-a0a9-7f0ec2c9e48e\") " pod="openstack/mariadb-client-5-default" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.414059 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 09 04:36:16 crc kubenswrapper[4766]: I1209 04:36:16.909718 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 04:36:17 crc kubenswrapper[4766]: I1209 04:36:17.132508 4766 generic.go:334] "Generic (PLEG): container finished" podID="62848099-e470-4866-a0a9-7f0ec2c9e48e" containerID="c52e25fbdbb89e8e7d851e9712dce246855714f5ef03c1a8effd6b1b8ab14046" exitCode=0 Dec 09 04:36:17 crc kubenswrapper[4766]: I1209 04:36:17.132547 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"62848099-e470-4866-a0a9-7f0ec2c9e48e","Type":"ContainerDied","Data":"c52e25fbdbb89e8e7d851e9712dce246855714f5ef03c1a8effd6b1b8ab14046"} Dec 09 04:36:17 crc kubenswrapper[4766]: I1209 04:36:17.132597 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"62848099-e470-4866-a0a9-7f0ec2c9e48e","Type":"ContainerStarted","Data":"ccae66bf0f5626a184d738f7affb6188feea8ecad95b4b237282f5f550723e09"} Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.546679 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.568331 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_62848099-e470-4866-a0a9-7f0ec2c9e48e/mariadb-client-5-default/0.log" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.601382 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.609453 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.651263 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dfqj\" (UniqueName: \"kubernetes.io/projected/62848099-e470-4866-a0a9-7f0ec2c9e48e-kube-api-access-6dfqj\") pod \"62848099-e470-4866-a0a9-7f0ec2c9e48e\" (UID: \"62848099-e470-4866-a0a9-7f0ec2c9e48e\") " Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.658615 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62848099-e470-4866-a0a9-7f0ec2c9e48e-kube-api-access-6dfqj" (OuterVolumeSpecName: "kube-api-access-6dfqj") pod "62848099-e470-4866-a0a9-7f0ec2c9e48e" (UID: "62848099-e470-4866-a0a9-7f0ec2c9e48e"). InnerVolumeSpecName "kube-api-access-6dfqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.753953 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dfqj\" (UniqueName: \"kubernetes.io/projected/62848099-e470-4866-a0a9-7f0ec2c9e48e-kube-api-access-6dfqj\") on node \"crc\" DevicePath \"\"" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.781126 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 04:36:18 crc kubenswrapper[4766]: E1209 04:36:18.784650 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62848099-e470-4866-a0a9-7f0ec2c9e48e" containerName="mariadb-client-5-default" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.784688 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="62848099-e470-4866-a0a9-7f0ec2c9e48e" containerName="mariadb-client-5-default" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.784939 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="62848099-e470-4866-a0a9-7f0ec2c9e48e" containerName="mariadb-client-5-default" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.790183 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.792771 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.854483 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62848099-e470-4866-a0a9-7f0ec2c9e48e" path="/var/lib/kubelet/pods/62848099-e470-4866-a0a9-7f0ec2c9e48e/volumes" Dec 09 04:36:18 crc kubenswrapper[4766]: I1209 04:36:18.957594 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sx8s\" (UniqueName: \"kubernetes.io/projected/ab818b8e-cd25-453a-9242-f18453a3a55b-kube-api-access-8sx8s\") pod \"mariadb-client-6-default\" (UID: \"ab818b8e-cd25-453a-9242-f18453a3a55b\") " pod="openstack/mariadb-client-6-default" Dec 09 04:36:19 crc kubenswrapper[4766]: I1209 04:36:19.059639 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sx8s\" (UniqueName: \"kubernetes.io/projected/ab818b8e-cd25-453a-9242-f18453a3a55b-kube-api-access-8sx8s\") pod \"mariadb-client-6-default\" (UID: \"ab818b8e-cd25-453a-9242-f18453a3a55b\") " pod="openstack/mariadb-client-6-default" Dec 09 04:36:19 crc kubenswrapper[4766]: I1209 04:36:19.078903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sx8s\" (UniqueName: \"kubernetes.io/projected/ab818b8e-cd25-453a-9242-f18453a3a55b-kube-api-access-8sx8s\") pod \"mariadb-client-6-default\" (UID: \"ab818b8e-cd25-453a-9242-f18453a3a55b\") " pod="openstack/mariadb-client-6-default" Dec 09 04:36:19 crc kubenswrapper[4766]: I1209 04:36:19.152173 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 09 04:36:19 crc kubenswrapper[4766]: I1209 04:36:19.153417 4766 scope.go:117] "RemoveContainer" containerID="c52e25fbdbb89e8e7d851e9712dce246855714f5ef03c1a8effd6b1b8ab14046" Dec 09 04:36:19 crc kubenswrapper[4766]: I1209 04:36:19.153445 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 09 04:36:19 crc kubenswrapper[4766]: I1209 04:36:19.677201 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 04:36:20 crc kubenswrapper[4766]: I1209 04:36:20.162033 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"ab818b8e-cd25-453a-9242-f18453a3a55b","Type":"ContainerStarted","Data":"43df571e93caec50c957c2caad8db4c9c17523bd165d1790d291668ea3494314"} Dec 09 04:36:20 crc kubenswrapper[4766]: I1209 04:36:20.162465 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"ab818b8e-cd25-453a-9242-f18453a3a55b","Type":"ContainerStarted","Data":"d58c7776fa98c77ad66f888164353ce131708be28992d2214861f6489ff49edf"} Dec 09 04:36:20 crc kubenswrapper[4766]: I1209 04:36:20.183107 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=2.183083595 podStartE2EDuration="2.183083595s" podCreationTimestamp="2025-12-09 04:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:36:20.178238613 +0000 UTC m=+5061.887544049" watchObservedRunningTime="2025-12-09 04:36:20.183083595 +0000 UTC m=+5061.892389041" Dec 09 04:36:21 crc kubenswrapper[4766]: I1209 04:36:21.174771 4766 generic.go:334] "Generic (PLEG): container finished" podID="ab818b8e-cd25-453a-9242-f18453a3a55b" 
containerID="43df571e93caec50c957c2caad8db4c9c17523bd165d1790d291668ea3494314" exitCode=1 Dec 09 04:36:21 crc kubenswrapper[4766]: I1209 04:36:21.174858 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"ab818b8e-cd25-453a-9242-f18453a3a55b","Type":"ContainerDied","Data":"43df571e93caec50c957c2caad8db4c9c17523bd165d1790d291668ea3494314"} Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.648949 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.693323 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.697935 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.817796 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sx8s\" (UniqueName: \"kubernetes.io/projected/ab818b8e-cd25-453a-9242-f18453a3a55b-kube-api-access-8sx8s\") pod \"ab818b8e-cd25-453a-9242-f18453a3a55b\" (UID: \"ab818b8e-cd25-453a-9242-f18453a3a55b\") " Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.826523 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab818b8e-cd25-453a-9242-f18453a3a55b-kube-api-access-8sx8s" (OuterVolumeSpecName: "kube-api-access-8sx8s") pod "ab818b8e-cd25-453a-9242-f18453a3a55b" (UID: "ab818b8e-cd25-453a-9242-f18453a3a55b"). InnerVolumeSpecName "kube-api-access-8sx8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.849231 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab818b8e-cd25-453a-9242-f18453a3a55b" path="/var/lib/kubelet/pods/ab818b8e-cd25-453a-9242-f18453a3a55b/volumes" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.882163 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 04:36:23 crc kubenswrapper[4766]: E1209 04:36:22.882752 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab818b8e-cd25-453a-9242-f18453a3a55b" containerName="mariadb-client-6-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.882768 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab818b8e-cd25-453a-9242-f18453a3a55b" containerName="mariadb-client-6-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.882937 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab818b8e-cd25-453a-9242-f18453a3a55b" containerName="mariadb-client-6-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.883455 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.892251 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:22.919878 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sx8s\" (UniqueName: \"kubernetes.io/projected/ab818b8e-cd25-453a-9242-f18453a3a55b-kube-api-access-8sx8s\") on node \"crc\" DevicePath \"\"" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:23.021437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/ba3762e5-3f62-42c1-93f7-02a027853a81-kube-api-access-zmvzz\") pod \"mariadb-client-7-default\" (UID: \"ba3762e5-3f62-42c1-93f7-02a027853a81\") " pod="openstack/mariadb-client-7-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:23.123910 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/ba3762e5-3f62-42c1-93f7-02a027853a81-kube-api-access-zmvzz\") pod \"mariadb-client-7-default\" (UID: \"ba3762e5-3f62-42c1-93f7-02a027853a81\") " pod="openstack/mariadb-client-7-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:23.173077 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/ba3762e5-3f62-42c1-93f7-02a027853a81-kube-api-access-zmvzz\") pod \"mariadb-client-7-default\" (UID: \"ba3762e5-3f62-42c1-93f7-02a027853a81\") " pod="openstack/mariadb-client-7-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:23.203063 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:23.205318 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:23.205339 4766 scope.go:117] "RemoveContainer" containerID="43df571e93caec50c957c2caad8db4c9c17523bd165d1790d291668ea3494314" Dec 09 04:36:23 crc kubenswrapper[4766]: I1209 04:36:23.714228 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 04:36:24 crc kubenswrapper[4766]: I1209 04:36:24.218154 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba3762e5-3f62-42c1-93f7-02a027853a81" containerID="5f3352328b8a9498de6d80ab3dd6bc6dae3b08ddc1cb9f4a336e62d703ed8337" exitCode=0 Dec 09 04:36:24 crc kubenswrapper[4766]: I1209 04:36:24.218278 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"ba3762e5-3f62-42c1-93f7-02a027853a81","Type":"ContainerDied","Data":"5f3352328b8a9498de6d80ab3dd6bc6dae3b08ddc1cb9f4a336e62d703ed8337"} Dec 09 04:36:24 crc kubenswrapper[4766]: I1209 04:36:24.218311 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"ba3762e5-3f62-42c1-93f7-02a027853a81","Type":"ContainerStarted","Data":"717ab441df9e47409db332ffd63a691558e0e04f1fb1296f75105a35058b75e5"} Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.704779 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.727801 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_ba3762e5-3f62-42c1-93f7-02a027853a81/mariadb-client-7-default/0.log" Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.756829 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.766065 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.839510 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:36:25 crc kubenswrapper[4766]: E1209 04:36:25.840069 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.870209 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/ba3762e5-3f62-42c1-93f7-02a027853a81-kube-api-access-zmvzz\") pod \"ba3762e5-3f62-42c1-93f7-02a027853a81\" (UID: \"ba3762e5-3f62-42c1-93f7-02a027853a81\") " Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.879389 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3762e5-3f62-42c1-93f7-02a027853a81-kube-api-access-zmvzz" (OuterVolumeSpecName: "kube-api-access-zmvzz") pod "ba3762e5-3f62-42c1-93f7-02a027853a81" (UID: 
"ba3762e5-3f62-42c1-93f7-02a027853a81"). InnerVolumeSpecName "kube-api-access-zmvzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.936989 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 09 04:36:25 crc kubenswrapper[4766]: E1209 04:36:25.937979 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3762e5-3f62-42c1-93f7-02a027853a81" containerName="mariadb-client-7-default" Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.938014 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3762e5-3f62-42c1-93f7-02a027853a81" containerName="mariadb-client-7-default" Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.938326 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3762e5-3f62-42c1-93f7-02a027853a81" containerName="mariadb-client-7-default" Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.939296 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.952565 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 09 04:36:25 crc kubenswrapper[4766]: I1209 04:36:25.973418 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/ba3762e5-3f62-42c1-93f7-02a027853a81-kube-api-access-zmvzz\") on node \"crc\" DevicePath \"\"" Dec 09 04:36:26 crc kubenswrapper[4766]: I1209 04:36:26.074792 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vqpv\" (UniqueName: \"kubernetes.io/projected/ad1ef16e-1425-476e-b2ed-f5ca126a731e-kube-api-access-5vqpv\") pod \"mariadb-client-2\" (UID: \"ad1ef16e-1425-476e-b2ed-f5ca126a731e\") " pod="openstack/mariadb-client-2" Dec 09 04:36:26 crc kubenswrapper[4766]: I1209 04:36:26.175920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqpv\" (UniqueName: \"kubernetes.io/projected/ad1ef16e-1425-476e-b2ed-f5ca126a731e-kube-api-access-5vqpv\") pod \"mariadb-client-2\" (UID: \"ad1ef16e-1425-476e-b2ed-f5ca126a731e\") " pod="openstack/mariadb-client-2" Dec 09 04:36:26 crc kubenswrapper[4766]: I1209 04:36:26.195869 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vqpv\" (UniqueName: \"kubernetes.io/projected/ad1ef16e-1425-476e-b2ed-f5ca126a731e-kube-api-access-5vqpv\") pod \"mariadb-client-2\" (UID: \"ad1ef16e-1425-476e-b2ed-f5ca126a731e\") " pod="openstack/mariadb-client-2" Dec 09 04:36:26 crc kubenswrapper[4766]: I1209 04:36:26.243362 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="717ab441df9e47409db332ffd63a691558e0e04f1fb1296f75105a35058b75e5" Dec 09 04:36:26 crc kubenswrapper[4766]: I1209 04:36:26.243414 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 09 04:36:26 crc kubenswrapper[4766]: I1209 04:36:26.265602 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 09 04:36:26 crc kubenswrapper[4766]: I1209 04:36:26.595485 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 09 04:36:26 crc kubenswrapper[4766]: I1209 04:36:26.850968 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3762e5-3f62-42c1-93f7-02a027853a81" path="/var/lib/kubelet/pods/ba3762e5-3f62-42c1-93f7-02a027853a81/volumes" Dec 09 04:36:27 crc kubenswrapper[4766]: I1209 04:36:27.257157 4766 generic.go:334] "Generic (PLEG): container finished" podID="ad1ef16e-1425-476e-b2ed-f5ca126a731e" containerID="a54071b40562abe95abb7f70cc61341865957c7329bb0cd8ef83acae5562aefc" exitCode=0 Dec 09 04:36:27 crc kubenswrapper[4766]: I1209 04:36:27.257266 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"ad1ef16e-1425-476e-b2ed-f5ca126a731e","Type":"ContainerDied","Data":"a54071b40562abe95abb7f70cc61341865957c7329bb0cd8ef83acae5562aefc"} Dec 09 04:36:27 crc kubenswrapper[4766]: I1209 04:36:27.257593 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"ad1ef16e-1425-476e-b2ed-f5ca126a731e","Type":"ContainerStarted","Data":"76a0702f10790a993ad3e86035d6870dcbcd50de3305e83018ef02c4fe314d5a"} Dec 09 04:36:28 crc kubenswrapper[4766]: I1209 04:36:28.664762 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 09 04:36:28 crc kubenswrapper[4766]: I1209 04:36:28.686038 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_ad1ef16e-1425-476e-b2ed-f5ca126a731e/mariadb-client-2/0.log" Dec 09 04:36:28 crc kubenswrapper[4766]: I1209 04:36:28.713122 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 09 04:36:28 crc kubenswrapper[4766]: I1209 04:36:28.719002 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 09 04:36:28 crc kubenswrapper[4766]: I1209 04:36:28.819139 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vqpv\" (UniqueName: \"kubernetes.io/projected/ad1ef16e-1425-476e-b2ed-f5ca126a731e-kube-api-access-5vqpv\") pod \"ad1ef16e-1425-476e-b2ed-f5ca126a731e\" (UID: \"ad1ef16e-1425-476e-b2ed-f5ca126a731e\") " Dec 09 04:36:28 crc kubenswrapper[4766]: I1209 04:36:28.826431 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1ef16e-1425-476e-b2ed-f5ca126a731e-kube-api-access-5vqpv" (OuterVolumeSpecName: "kube-api-access-5vqpv") pod "ad1ef16e-1425-476e-b2ed-f5ca126a731e" (UID: "ad1ef16e-1425-476e-b2ed-f5ca126a731e"). InnerVolumeSpecName "kube-api-access-5vqpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:36:28 crc kubenswrapper[4766]: I1209 04:36:28.847530 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad1ef16e-1425-476e-b2ed-f5ca126a731e" path="/var/lib/kubelet/pods/ad1ef16e-1425-476e-b2ed-f5ca126a731e/volumes" Dec 09 04:36:28 crc kubenswrapper[4766]: I1209 04:36:28.920487 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vqpv\" (UniqueName: \"kubernetes.io/projected/ad1ef16e-1425-476e-b2ed-f5ca126a731e-kube-api-access-5vqpv\") on node \"crc\" DevicePath \"\"" Dec 09 04:36:29 crc kubenswrapper[4766]: I1209 04:36:29.281403 4766 scope.go:117] "RemoveContainer" containerID="a54071b40562abe95abb7f70cc61341865957c7329bb0cd8ef83acae5562aefc" Dec 09 04:36:29 crc kubenswrapper[4766]: I1209 04:36:29.281430 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 09 04:36:40 crc kubenswrapper[4766]: I1209 04:36:40.839094 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:36:40 crc kubenswrapper[4766]: E1209 04:36:40.841063 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:36:51 crc kubenswrapper[4766]: I1209 04:36:51.840453 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:36:51 crc kubenswrapper[4766]: E1209 04:36:51.841630 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:37:06 crc kubenswrapper[4766]: I1209 04:37:06.839199 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:37:06 crc kubenswrapper[4766]: E1209 04:37:06.840279 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:37:19 crc kubenswrapper[4766]: I1209 04:37:19.839088 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:37:19 crc kubenswrapper[4766]: E1209 04:37:19.839894 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:37:34 crc kubenswrapper[4766]: I1209 04:37:34.839609 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:37:34 crc kubenswrapper[4766]: E1209 04:37:34.840586 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:37:47 crc kubenswrapper[4766]: I1209 04:37:47.840109 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:37:47 crc kubenswrapper[4766]: E1209 04:37:47.841206 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:38:00 crc kubenswrapper[4766]: I1209 04:38:00.840089 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:38:00 crc kubenswrapper[4766]: E1209 04:38:00.841117 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:38:12 crc kubenswrapper[4766]: I1209 04:38:12.839946 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:38:12 crc kubenswrapper[4766]: E1209 04:38:12.841585 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:38:24 crc kubenswrapper[4766]: I1209 04:38:24.839834 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:38:24 crc kubenswrapper[4766]: E1209 04:38:24.842552 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:38:37 crc kubenswrapper[4766]: I1209 04:38:37.839846 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:38:37 crc kubenswrapper[4766]: E1209 04:38:37.840861 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:38:50 crc kubenswrapper[4766]: I1209 04:38:50.842075 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:38:50 crc kubenswrapper[4766]: E1209 04:38:50.845460 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:39:04 crc kubenswrapper[4766]: I1209 04:39:04.840027 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:39:04 crc kubenswrapper[4766]: E1209 04:39:04.841202 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:39:17 crc kubenswrapper[4766]: I1209 04:39:17.840520 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:39:17 crc kubenswrapper[4766]: E1209 04:39:17.841753 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:39:32 crc kubenswrapper[4766]: I1209 04:39:32.839682 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:39:32 crc kubenswrapper[4766]: E1209 04:39:32.840451 4766 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:39:44 crc kubenswrapper[4766]: I1209 04:39:44.839674 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:39:45 crc kubenswrapper[4766]: I1209 04:39:45.160858 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"caec6864870b555babc6bdf649e5fc84b874ca528991ac1003b50ad5ab1fca38"} Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.612326 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 04:41:13 crc kubenswrapper[4766]: E1209 04:41:13.613133 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1ef16e-1425-476e-b2ed-f5ca126a731e" containerName="mariadb-client-2" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.613146 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1ef16e-1425-476e-b2ed-f5ca126a731e" containerName="mariadb-client-2" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.613311 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1ef16e-1425-476e-b2ed-f5ca126a731e" containerName="mariadb-client-2" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.613884 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.616196 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-w94mz" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.621852 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.668093 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\") pod \"mariadb-copy-data\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") " pod="openstack/mariadb-copy-data" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.668245 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k78s\" (UniqueName: \"kubernetes.io/projected/1900c1e7-52f3-47a4-933d-84b2816bf2e8-kube-api-access-5k78s\") pod \"mariadb-copy-data\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") " pod="openstack/mariadb-copy-data" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.770443 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k78s\" (UniqueName: \"kubernetes.io/projected/1900c1e7-52f3-47a4-933d-84b2816bf2e8-kube-api-access-5k78s\") pod \"mariadb-copy-data\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") " pod="openstack/mariadb-copy-data" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.770587 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\") pod \"mariadb-copy-data\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") " pod="openstack/mariadb-copy-data" 
Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.773410 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.773447 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\") pod \"mariadb-copy-data\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/56b22f24ed70d24a2aead65327c7f356ba6f13f66da7fe0537b768d2c2d40c5a/globalmount\"" pod="openstack/mariadb-copy-data" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.789607 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k78s\" (UniqueName: \"kubernetes.io/projected/1900c1e7-52f3-47a4-933d-84b2816bf2e8-kube-api-access-5k78s\") pod \"mariadb-copy-data\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") " pod="openstack/mariadb-copy-data" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.801328 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\") pod \"mariadb-copy-data\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") " pod="openstack/mariadb-copy-data" Dec 09 04:41:13 crc kubenswrapper[4766]: I1209 04:41:13.931722 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 09 04:41:14 crc kubenswrapper[4766]: I1209 04:41:14.430001 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 04:41:15 crc kubenswrapper[4766]: I1209 04:41:15.033915 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1900c1e7-52f3-47a4-933d-84b2816bf2e8","Type":"ContainerStarted","Data":"687a52b4967a2e88c68dbdbfccfd77b69ca1c9da06e35498ff2363c2f7d7d096"} Dec 09 04:41:15 crc kubenswrapper[4766]: I1209 04:41:15.034481 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1900c1e7-52f3-47a4-933d-84b2816bf2e8","Type":"ContainerStarted","Data":"b42181778f66570a909cf354a41016fea4918cb35792321e397d8edcbab4adba"} Dec 09 04:41:15 crc kubenswrapper[4766]: I1209 04:41:15.055180 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.055155843 podStartE2EDuration="3.055155843s" podCreationTimestamp="2025-12-09 04:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:41:15.048963734 +0000 UTC m=+5356.758269210" watchObservedRunningTime="2025-12-09 04:41:15.055155843 +0000 UTC m=+5356.764461309" Dec 09 04:41:18 crc kubenswrapper[4766]: I1209 04:41:18.329436 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:18 crc kubenswrapper[4766]: I1209 04:41:18.331023 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 09 04:41:18 crc kubenswrapper[4766]: I1209 04:41:18.342702 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:18 crc kubenswrapper[4766]: I1209 04:41:18.351299 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc57t\" (UniqueName: \"kubernetes.io/projected/4ea336ce-d8c8-4395-8236-02d0ed24ff46-kube-api-access-lc57t\") pod \"mariadb-client\" (UID: \"4ea336ce-d8c8-4395-8236-02d0ed24ff46\") " pod="openstack/mariadb-client" Dec 09 04:41:18 crc kubenswrapper[4766]: I1209 04:41:18.452180 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc57t\" (UniqueName: \"kubernetes.io/projected/4ea336ce-d8c8-4395-8236-02d0ed24ff46-kube-api-access-lc57t\") pod \"mariadb-client\" (UID: \"4ea336ce-d8c8-4395-8236-02d0ed24ff46\") " pod="openstack/mariadb-client" Dec 09 04:41:18 crc kubenswrapper[4766]: I1209 04:41:18.479796 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc57t\" (UniqueName: \"kubernetes.io/projected/4ea336ce-d8c8-4395-8236-02d0ed24ff46-kube-api-access-lc57t\") pod \"mariadb-client\" (UID: \"4ea336ce-d8c8-4395-8236-02d0ed24ff46\") " pod="openstack/mariadb-client" Dec 09 04:41:18 crc kubenswrapper[4766]: I1209 04:41:18.670349 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 09 04:41:19 crc kubenswrapper[4766]: I1209 04:41:19.099770 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:19 crc kubenswrapper[4766]: W1209 04:41:19.105131 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ea336ce_d8c8_4395_8236_02d0ed24ff46.slice/crio-17e3ad239d053351c3e3b36d942d44129f17a9dff3e4ffe3841f733ccbc8eb27 WatchSource:0}: Error finding container 17e3ad239d053351c3e3b36d942d44129f17a9dff3e4ffe3841f733ccbc8eb27: Status 404 returned error can't find the container with id 17e3ad239d053351c3e3b36d942d44129f17a9dff3e4ffe3841f733ccbc8eb27 Dec 09 04:41:20 crc kubenswrapper[4766]: I1209 04:41:20.086766 4766 generic.go:334] "Generic (PLEG): container finished" podID="4ea336ce-d8c8-4395-8236-02d0ed24ff46" containerID="46e4b938be7b973d63f3a22a6fe4b8fb9c73070336a0c4297b06ebcc79f343d9" exitCode=0 Dec 09 04:41:20 crc kubenswrapper[4766]: I1209 04:41:20.086880 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4ea336ce-d8c8-4395-8236-02d0ed24ff46","Type":"ContainerDied","Data":"46e4b938be7b973d63f3a22a6fe4b8fb9c73070336a0c4297b06ebcc79f343d9"} Dec 09 04:41:20 crc kubenswrapper[4766]: I1209 04:41:20.087135 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4ea336ce-d8c8-4395-8236-02d0ed24ff46","Type":"ContainerStarted","Data":"17e3ad239d053351c3e3b36d942d44129f17a9dff3e4ffe3841f733ccbc8eb27"} Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.471428 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.495090 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_4ea336ce-d8c8-4395-8236-02d0ed24ff46/mariadb-client/0.log" Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.502997 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc57t\" (UniqueName: \"kubernetes.io/projected/4ea336ce-d8c8-4395-8236-02d0ed24ff46-kube-api-access-lc57t\") pod \"4ea336ce-d8c8-4395-8236-02d0ed24ff46\" (UID: \"4ea336ce-d8c8-4395-8236-02d0ed24ff46\") " Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.509487 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea336ce-d8c8-4395-8236-02d0ed24ff46-kube-api-access-lc57t" (OuterVolumeSpecName: "kube-api-access-lc57t") pod "4ea336ce-d8c8-4395-8236-02d0ed24ff46" (UID: "4ea336ce-d8c8-4395-8236-02d0ed24ff46"). InnerVolumeSpecName "kube-api-access-lc57t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.528585 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.536710 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.604486 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc57t\" (UniqueName: \"kubernetes.io/projected/4ea336ce-d8c8-4395-8236-02d0ed24ff46-kube-api-access-lc57t\") on node \"crc\" DevicePath \"\"" Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.733298 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:21 crc kubenswrapper[4766]: E1209 04:41:21.733685 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea336ce-d8c8-4395-8236-02d0ed24ff46" containerName="mariadb-client" Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.733700 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea336ce-d8c8-4395-8236-02d0ed24ff46" containerName="mariadb-client" Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.733843 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea336ce-d8c8-4395-8236-02d0ed24ff46" containerName="mariadb-client" Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.734481 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.740778 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:21 crc kubenswrapper[4766]: I1209 04:41:21.909774 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8cb4\" (UniqueName: \"kubernetes.io/projected/b078d080-be05-4292-a2f4-a681effddc06-kube-api-access-g8cb4\") pod \"mariadb-client\" (UID: \"b078d080-be05-4292-a2f4-a681effddc06\") " pod="openstack/mariadb-client" Dec 09 04:41:22 crc kubenswrapper[4766]: I1209 04:41:22.011005 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8cb4\" (UniqueName: \"kubernetes.io/projected/b078d080-be05-4292-a2f4-a681effddc06-kube-api-access-g8cb4\") pod \"mariadb-client\" (UID: \"b078d080-be05-4292-a2f4-a681effddc06\") " pod="openstack/mariadb-client" Dec 09 04:41:22 crc kubenswrapper[4766]: I1209 04:41:22.039574 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8cb4\" (UniqueName: \"kubernetes.io/projected/b078d080-be05-4292-a2f4-a681effddc06-kube-api-access-g8cb4\") pod \"mariadb-client\" (UID: \"b078d080-be05-4292-a2f4-a681effddc06\") " pod="openstack/mariadb-client" Dec 09 04:41:22 crc kubenswrapper[4766]: I1209 04:41:22.058771 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 09 04:41:22 crc kubenswrapper[4766]: I1209 04:41:22.110755 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17e3ad239d053351c3e3b36d942d44129f17a9dff3e4ffe3841f733ccbc8eb27" Dec 09 04:41:22 crc kubenswrapper[4766]: I1209 04:41:22.110862 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 09 04:41:22 crc kubenswrapper[4766]: I1209 04:41:22.128898 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="4ea336ce-d8c8-4395-8236-02d0ed24ff46" podUID="b078d080-be05-4292-a2f4-a681effddc06" Dec 09 04:41:22 crc kubenswrapper[4766]: I1209 04:41:22.335677 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:22 crc kubenswrapper[4766]: W1209 04:41:22.336909 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb078d080_be05_4292_a2f4_a681effddc06.slice/crio-0d342becb8c9079d473b6570d7483dbf8aaa74af11271eb5d24e4c4e144d7896 WatchSource:0}: Error finding container 0d342becb8c9079d473b6570d7483dbf8aaa74af11271eb5d24e4c4e144d7896: Status 404 returned error can't find the container with id 0d342becb8c9079d473b6570d7483dbf8aaa74af11271eb5d24e4c4e144d7896 Dec 09 04:41:22 crc kubenswrapper[4766]: I1209 04:41:22.851924 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea336ce-d8c8-4395-8236-02d0ed24ff46" path="/var/lib/kubelet/pods/4ea336ce-d8c8-4395-8236-02d0ed24ff46/volumes" Dec 09 04:41:23 crc kubenswrapper[4766]: I1209 04:41:23.122396 4766 generic.go:334] "Generic (PLEG): container finished" podID="b078d080-be05-4292-a2f4-a681effddc06" containerID="62fef8acebfe22e99f9cd04a6c48e0247f77e8ba818059220a70f58b9934bb84" exitCode=0 Dec 09 04:41:23 crc kubenswrapper[4766]: I1209 04:41:23.122467 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b078d080-be05-4292-a2f4-a681effddc06","Type":"ContainerDied","Data":"62fef8acebfe22e99f9cd04a6c48e0247f77e8ba818059220a70f58b9934bb84"} Dec 09 04:41:23 crc kubenswrapper[4766]: I1209 04:41:23.122547 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"b078d080-be05-4292-a2f4-a681effddc06","Type":"ContainerStarted","Data":"0d342becb8c9079d473b6570d7483dbf8aaa74af11271eb5d24e4c4e144d7896"} Dec 09 04:41:24 crc kubenswrapper[4766]: I1209 04:41:24.512344 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 09 04:41:24 crc kubenswrapper[4766]: I1209 04:41:24.535689 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_b078d080-be05-4292-a2f4-a681effddc06/mariadb-client/0.log" Dec 09 04:41:24 crc kubenswrapper[4766]: I1209 04:41:24.570008 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:24 crc kubenswrapper[4766]: I1209 04:41:24.577378 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 09 04:41:24 crc kubenswrapper[4766]: I1209 04:41:24.655778 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8cb4\" (UniqueName: \"kubernetes.io/projected/b078d080-be05-4292-a2f4-a681effddc06-kube-api-access-g8cb4\") pod \"b078d080-be05-4292-a2f4-a681effddc06\" (UID: \"b078d080-be05-4292-a2f4-a681effddc06\") " Dec 09 04:41:24 crc kubenswrapper[4766]: I1209 04:41:24.666572 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b078d080-be05-4292-a2f4-a681effddc06-kube-api-access-g8cb4" (OuterVolumeSpecName: "kube-api-access-g8cb4") pod "b078d080-be05-4292-a2f4-a681effddc06" (UID: "b078d080-be05-4292-a2f4-a681effddc06"). InnerVolumeSpecName "kube-api-access-g8cb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:41:24 crc kubenswrapper[4766]: I1209 04:41:24.757447 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8cb4\" (UniqueName: \"kubernetes.io/projected/b078d080-be05-4292-a2f4-a681effddc06-kube-api-access-g8cb4\") on node \"crc\" DevicePath \"\"" Dec 09 04:41:24 crc kubenswrapper[4766]: I1209 04:41:24.851104 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b078d080-be05-4292-a2f4-a681effddc06" path="/var/lib/kubelet/pods/b078d080-be05-4292-a2f4-a681effddc06/volumes" Dec 09 04:41:25 crc kubenswrapper[4766]: I1209 04:41:25.143638 4766 scope.go:117] "RemoveContainer" containerID="62fef8acebfe22e99f9cd04a6c48e0247f77e8ba818059220a70f58b9934bb84" Dec 09 04:41:25 crc kubenswrapper[4766]: I1209 04:41:25.143715 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.447712 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 04:41:56 crc kubenswrapper[4766]: E1209 04:41:56.448728 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b078d080-be05-4292-a2f4-a681effddc06" containerName="mariadb-client" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.448748 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b078d080-be05-4292-a2f4-a681effddc06" containerName="mariadb-client" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.448989 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b078d080-be05-4292-a2f4-a681effddc06" containerName="mariadb-client" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.450144 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.453467 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-797p7" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.453761 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.456296 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.466854 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.469399 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.482963 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.484708 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.495044 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.510602 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.536271 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566269 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85d69681-4b38-42ca-9e60-93fb02d90e75-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566350 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566379 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/85d69681-4b38-42ca-9e60-93fb02d90e75-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566497 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85d69681-4b38-42ca-9e60-93fb02d90e75-config\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " 
pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566547 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23400620-2f42-40c5-8292-17c64ab567ea-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566675 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-630d1665-e5ab-4490-9770-467b18b9f77f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-630d1665-e5ab-4490-9770-467b18b9f77f\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566723 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0f2d3f1d-aa35-4e94-bbe4-bfcb6b89e29e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f2d3f1d-aa35-4e94-bbe4-bfcb6b89e29e\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566789 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdzb\" (UniqueName: \"kubernetes.io/projected/85d69681-4b38-42ca-9e60-93fb02d90e75-kube-api-access-fzdzb\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566835 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23400620-2f42-40c5-8292-17c64ab567ea-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " 
pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566886 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566932 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-config\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.566960 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23400620-2f42-40c5-8292-17c64ab567ea-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.567017 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d69681-4b38-42ca-9e60-93fb02d90e75-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.567048 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c277f33a-323c-4163-a1e6-b051fab2f129\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c277f33a-323c-4163-a1e6-b051fab2f129\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc 
kubenswrapper[4766]: I1209 04:41:56.567074 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlxqg\" (UniqueName: \"kubernetes.io/projected/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-kube-api-access-mlxqg\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.567107 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23400620-2f42-40c5-8292-17c64ab567ea-config\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.567140 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w6r4\" (UniqueName: \"kubernetes.io/projected/23400620-2f42-40c5-8292-17c64ab567ea-kube-api-access-4w6r4\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.567165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.659170 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.660786 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.663631 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.664021 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.664032 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-k8kdv" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668396 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c277f33a-323c-4163-a1e6-b051fab2f129\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c277f33a-323c-4163-a1e6-b051fab2f129\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668440 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlxqg\" (UniqueName: \"kubernetes.io/projected/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-kube-api-access-mlxqg\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668467 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23400620-2f42-40c5-8292-17c64ab567ea-config\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668491 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w6r4\" (UniqueName: \"kubernetes.io/projected/23400620-2f42-40c5-8292-17c64ab567ea-kube-api-access-4w6r4\") pod 
\"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668524 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668588 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85d69681-4b38-42ca-9e60-93fb02d90e75-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668623 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668653 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/85d69681-4b38-42ca-9e60-93fb02d90e75-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668694 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85d69681-4b38-42ca-9e60-93fb02d90e75-config\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668716 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23400620-2f42-40c5-8292-17c64ab567ea-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668758 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-630d1665-e5ab-4490-9770-467b18b9f77f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-630d1665-e5ab-4490-9770-467b18b9f77f\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668783 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0f2d3f1d-aa35-4e94-bbe4-bfcb6b89e29e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f2d3f1d-aa35-4e94-bbe4-bfcb6b89e29e\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668819 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdzb\" (UniqueName: \"kubernetes.io/projected/85d69681-4b38-42ca-9e60-93fb02d90e75-kube-api-access-fzdzb\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668844 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23400620-2f42-40c5-8292-17c64ab567ea-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668879 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668911 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-config\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668938 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23400620-2f42-40c5-8292-17c64ab567ea-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.668983 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d69681-4b38-42ca-9e60-93fb02d90e75-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.669012 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.670483 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.670500 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85d69681-4b38-42ca-9e60-93fb02d90e75-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.670780 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.671328 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/85d69681-4b38-42ca-9e60-93fb02d90e75-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.672017 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23400620-2f42-40c5-8292-17c64ab567ea-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.673467 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23400620-2f42-40c5-8292-17c64ab567ea-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.675731 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/23400620-2f42-40c5-8292-17c64ab567ea-config\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.676227 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.676470 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.676518 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-630d1665-e5ab-4490-9770-467b18b9f77f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-630d1665-e5ab-4490-9770-467b18b9f77f\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fc419b22870a994e63847802b8788a729b2f553f0146a9501cb7657f6608ffc6/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.676604 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.676636 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c277f33a-323c-4163-a1e6-b051fab2f129\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c277f33a-323c-4163-a1e6-b051fab2f129\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ad30052c868b4a36aba99bce4bcc7e4be80aab1415dcb336768a5b5e4d9a6136/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.677013 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.677042 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0f2d3f1d-aa35-4e94-bbe4-bfcb6b89e29e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f2d3f1d-aa35-4e94-bbe4-bfcb6b89e29e\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4181ca65e66f569220e6dc6108c4abe7d5c3f157aac10b68ebebce23aeec5c87/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.682436 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d69681-4b38-42ca-9e60-93fb02d90e75-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.690478 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85d69681-4b38-42ca-9e60-93fb02d90e75-config\") pod \"ovsdbserver-nb-2\" (UID: 
\"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.692724 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23400620-2f42-40c5-8292-17c64ab567ea-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.693748 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-config\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.697425 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdzb\" (UniqueName: \"kubernetes.io/projected/85d69681-4b38-42ca-9e60-93fb02d90e75-kube-api-access-fzdzb\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.698080 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w6r4\" (UniqueName: \"kubernetes.io/projected/23400620-2f42-40c5-8292-17c64ab567ea-kube-api-access-4w6r4\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.699293 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.702441 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mlxqg\" (UniqueName: \"kubernetes.io/projected/be8365e2-d3eb-4562-a24a-6dfb4342cfd5-kube-api-access-mlxqg\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.709939 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.718253 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.753971 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-630d1665-e5ab-4490-9770-467b18b9f77f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-630d1665-e5ab-4490-9770-467b18b9f77f\") pod \"ovsdbserver-nb-1\" (UID: \"be8365e2-d3eb-4562-a24a-6dfb4342cfd5\") " pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.756319 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.761106 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0f2d3f1d-aa35-4e94-bbe4-bfcb6b89e29e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0f2d3f1d-aa35-4e94-bbe4-bfcb6b89e29e\") pod \"ovsdbserver-nb-0\" (UID: \"23400620-2f42-40c5-8292-17c64ab567ea\") " pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770343 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-config\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770388 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pjpt\" (UniqueName: \"kubernetes.io/projected/259ef0b7-117e-4842-8af1-2038ecd40d47-kube-api-access-7pjpt\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770438 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770441 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770495 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770531 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-config\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770573 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 
04:41:56.770600 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259ef0b7-117e-4842-8af1-2038ecd40d47-config\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-df84a50f-ea61-4cc2-bade-e073a7113a15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df84a50f-ea61-4cc2-bade-e073a7113a15\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770693 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770792 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770816 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ffd19cdf-014b-4fd1-9f99-1ef4d930eec4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ffd19cdf-014b-4fd1-9f99-1ef4d930eec4\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 
04:41:56.770841 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2bbm\" (UniqueName: \"kubernetes.io/projected/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-kube-api-access-v2bbm\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770878 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770914 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/259ef0b7-117e-4842-8af1-2038ecd40d47-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770936 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fab9e811-2016-448e-aee9-13407b1fc11a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fab9e811-2016-448e-aee9-13407b1fc11a\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770955 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259ef0b7-117e-4842-8af1-2038ecd40d47-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770971 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/259ef0b7-117e-4842-8af1-2038ecd40d47-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.770999 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpw8g\" (UniqueName: \"kubernetes.io/projected/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-kube-api-access-hpw8g\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.773997 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c277f33a-323c-4163-a1e6-b051fab2f129\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c277f33a-323c-4163-a1e6-b051fab2f129\") pod \"ovsdbserver-nb-2\" (UID: \"85d69681-4b38-42ca-9e60-93fb02d90e75\") " pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.777611 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.817819 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.837521 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.845597 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.871983 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.874716 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.875417 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.876065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.876163 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-config\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.876262 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.876311 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259ef0b7-117e-4842-8af1-2038ecd40d47-config\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.878069 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259ef0b7-117e-4842-8af1-2038ecd40d47-config\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.878511 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.879281 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-df84a50f-ea61-4cc2-bade-e073a7113a15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df84a50f-ea61-4cc2-bade-e073a7113a15\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.879347 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 
04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.881236 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.882360 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ffd19cdf-014b-4fd1-9f99-1ef4d930eec4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ffd19cdf-014b-4fd1-9f99-1ef4d930eec4\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.882440 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2bbm\" (UniqueName: \"kubernetes.io/projected/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-kube-api-access-v2bbm\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.882537 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.882633 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/259ef0b7-117e-4842-8af1-2038ecd40d47-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.882682 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-fab9e811-2016-448e-aee9-13407b1fc11a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fab9e811-2016-448e-aee9-13407b1fc11a\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.882717 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259ef0b7-117e-4842-8af1-2038ecd40d47-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.882756 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/259ef0b7-117e-4842-8af1-2038ecd40d47-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.882840 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpw8g\" (UniqueName: \"kubernetes.io/projected/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-kube-api-access-hpw8g\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.883206 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-config\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.883300 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pjpt\" (UniqueName: \"kubernetes.io/projected/259ef0b7-117e-4842-8af1-2038ecd40d47-kube-api-access-7pjpt\") 
pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.883509 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.883560 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-df84a50f-ea61-4cc2-bade-e073a7113a15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df84a50f-ea61-4cc2-bade-e073a7113a15\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d47caf27da6443e9d63b24faa8e8d7b5b3b05a917a9dd22594bb7e4e10b671dc/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.885705 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/259ef0b7-117e-4842-8af1-2038ecd40d47-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.885944 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-config\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.887799 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.887977 4766 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.888026 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ffd19cdf-014b-4fd1-9f99-1ef4d930eec4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ffd19cdf-014b-4fd1-9f99-1ef4d930eec4\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/943aa8cb041474fd3006788b3efe9d2fb7021f7c4a1c3651e1da7443d42169ce/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.889087 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.889431 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.890188 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/259ef0b7-117e-4842-8af1-2038ecd40d47-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.892834 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.892872 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fab9e811-2016-448e-aee9-13407b1fc11a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fab9e811-2016-448e-aee9-13407b1fc11a\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7303878aa3941045cf2ad7ee341c5216f35862e166076ad9a35f5eebd8b0296f/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.894236 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-config\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.894534 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/259ef0b7-117e-4842-8af1-2038ecd40d47-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.904133 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpw8g\" (UniqueName: \"kubernetes.io/projected/3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6-kube-api-access-hpw8g\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.908530 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pjpt\" (UniqueName: \"kubernetes.io/projected/259ef0b7-117e-4842-8af1-2038ecd40d47-kube-api-access-7pjpt\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " 
pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.922135 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2bbm\" (UniqueName: \"kubernetes.io/projected/d16ac2e2-75aa-4533-8fc6-4e0e8e006b14-kube-api-access-v2bbm\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.936685 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-df84a50f-ea61-4cc2-bade-e073a7113a15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-df84a50f-ea61-4cc2-bade-e073a7113a15\") pod \"ovsdbserver-sb-1\" (UID: \"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6\") " pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.946539 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fab9e811-2016-448e-aee9-13407b1fc11a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fab9e811-2016-448e-aee9-13407b1fc11a\") pod \"ovsdbserver-sb-0\" (UID: \"259ef0b7-117e-4842-8af1-2038ecd40d47\") " pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:56 crc kubenswrapper[4766]: I1209 04:41:56.983913 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ffd19cdf-014b-4fd1-9f99-1ef4d930eec4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ffd19cdf-014b-4fd1-9f99-1ef4d930eec4\") pod \"ovsdbserver-sb-2\" (UID: \"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14\") " pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:57 crc kubenswrapper[4766]: I1209 04:41:57.070795 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 09 04:41:57 crc kubenswrapper[4766]: I1209 04:41:57.081414 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 09 04:41:57 crc kubenswrapper[4766]: I1209 04:41:57.087478 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 09 04:41:57 crc kubenswrapper[4766]: I1209 04:41:57.373438 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 09 04:41:57 crc kubenswrapper[4766]: I1209 04:41:57.472060 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 09 04:41:57 crc kubenswrapper[4766]: W1209 04:41:57.475451 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod259ef0b7_117e_4842_8af1_2038ecd40d47.slice/crio-aeead5824bed88f339210b42a8d858faf3fa35f31f1f6f582e6b3ac036a1b831 WatchSource:0}: Error finding container aeead5824bed88f339210b42a8d858faf3fa35f31f1f6f582e6b3ac036a1b831: Status 404 returned error can't find the container with id aeead5824bed88f339210b42a8d858faf3fa35f31f1f6f582e6b3ac036a1b831 Dec 09 04:41:57 crc kubenswrapper[4766]: I1209 04:41:57.727871 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 09 04:41:57 crc kubenswrapper[4766]: W1209 04:41:57.736396 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd16ac2e2_75aa_4533_8fc6_4e0e8e006b14.slice/crio-3f1fc19237eb41567a3d11a136a23bff774c24b4b90b2128c387ec3be0c0a5e6 WatchSource:0}: Error finding container 3f1fc19237eb41567a3d11a136a23bff774c24b4b90b2128c387ec3be0c0a5e6: Status 404 returned error can't find the container with id 3f1fc19237eb41567a3d11a136a23bff774c24b4b90b2128c387ec3be0c0a5e6 Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.295924 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 09 04:41:58 crc kubenswrapper[4766]: W1209 04:41:58.298987 4766 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe8365e2_d3eb_4562_a24a_6dfb4342cfd5.slice/crio-605dedac296850a9c5a0f2a9ff4353488d0ebe808c1614c5ed0f3b07d1c0ba82 WatchSource:0}: Error finding container 605dedac296850a9c5a0f2a9ff4353488d0ebe808c1614c5ed0f3b07d1c0ba82: Status 404 returned error can't find the container with id 605dedac296850a9c5a0f2a9ff4353488d0ebe808c1614c5ed0f3b07d1c0ba82 Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.335190 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"be8365e2-d3eb-4562-a24a-6dfb4342cfd5","Type":"ContainerStarted","Data":"605dedac296850a9c5a0f2a9ff4353488d0ebe808c1614c5ed0f3b07d1c0ba82"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.340712 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"23400620-2f42-40c5-8292-17c64ab567ea","Type":"ContainerStarted","Data":"dc1db52e948c4b1f47803ad3c26a02c5558d9a0da90b4193f499c64133725557"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.340765 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"23400620-2f42-40c5-8292-17c64ab567ea","Type":"ContainerStarted","Data":"81e48f4a4d153bf63cff8cba261345f8d72ad805988f7d11354ed667eadf4376"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.340776 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"23400620-2f42-40c5-8292-17c64ab567ea","Type":"ContainerStarted","Data":"69dbc4f5acdc53fda183cb7839f956addd8fa8d03f46ca87961026c7f1082090"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.342824 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"259ef0b7-117e-4842-8af1-2038ecd40d47","Type":"ContainerStarted","Data":"20aaa54010e3093296275e726295a78f64030c5342e551c7c1d9b086a53c313c"} Dec 09 
04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.342856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"259ef0b7-117e-4842-8af1-2038ecd40d47","Type":"ContainerStarted","Data":"22acbbab4d50b8d5fc616104becb50c5923919c88cf102d5862ff9198efd4a4c"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.342866 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"259ef0b7-117e-4842-8af1-2038ecd40d47","Type":"ContainerStarted","Data":"aeead5824bed88f339210b42a8d858faf3fa35f31f1f6f582e6b3ac036a1b831"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.344780 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14","Type":"ContainerStarted","Data":"3d62fd8b2fbdef7106d4d52f499a8330e843ec3132b938d5140bb0d697abce4a"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.344802 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14","Type":"ContainerStarted","Data":"e217584db1b92e92b319925e71cf5250fb0bf5029257df6053f797c3cf623642"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.344811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d16ac2e2-75aa-4533-8fc6-4e0e8e006b14","Type":"ContainerStarted","Data":"3f1fc19237eb41567a3d11a136a23bff774c24b4b90b2128c387ec3be0c0a5e6"} Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.364954 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.364927738 podStartE2EDuration="3.364927738s" podCreationTimestamp="2025-12-09 04:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:41:58.357074625 +0000 UTC m=+5400.066380061" 
watchObservedRunningTime="2025-12-09 04:41:58.364927738 +0000 UTC m=+5400.074233194" Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.389092 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.38907059 podStartE2EDuration="3.38907059s" podCreationTimestamp="2025-12-09 04:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:41:58.376912212 +0000 UTC m=+5400.086217698" watchObservedRunningTime="2025-12-09 04:41:58.38907059 +0000 UTC m=+5400.098376016" Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.400748 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.400723315 podStartE2EDuration="3.400723315s" podCreationTimestamp="2025-12-09 04:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:41:58.391384383 +0000 UTC m=+5400.100689859" watchObservedRunningTime="2025-12-09 04:41:58.400723315 +0000 UTC m=+5400.110028751" Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.429433 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 09 04:41:58 crc kubenswrapper[4766]: W1209 04:41:58.429473 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d69681_4b38_42ca_9e60_93fb02d90e75.slice/crio-09423cb04d78e920ef87e33102d91c433b584fc61c46cd005d21be1590673d1d WatchSource:0}: Error finding container 09423cb04d78e920ef87e33102d91c433b584fc61c46cd005d21be1590673d1d: Status 404 returned error can't find the container with id 09423cb04d78e920ef87e33102d91c433b584fc61c46cd005d21be1590673d1d Dec 09 04:41:58 crc kubenswrapper[4766]: I1209 04:41:58.574271 4766 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 09 04:41:58 crc kubenswrapper[4766]: W1209 04:41:58.584829 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d9fdceb_c59e_4d47_8dc1_0d6bdcb748d6.slice/crio-f209d51de6cdcdfd8bc2470fac709568266f1ae7fd2a54fa384cc7548743e8f6 WatchSource:0}: Error finding container f209d51de6cdcdfd8bc2470fac709568266f1ae7fd2a54fa384cc7548743e8f6: Status 404 returned error can't find the container with id f209d51de6cdcdfd8bc2470fac709568266f1ae7fd2a54fa384cc7548743e8f6 Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.358763 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6","Type":"ContainerStarted","Data":"0356c7131e7d0725baee7a6ea25297739371f3f02d621b81f52f036f77f9806d"} Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.359149 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6","Type":"ContainerStarted","Data":"d8159b696259aa76baea1a9bd261186c262b10b9e2858c089f2adb80ba56d89d"} Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.359186 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6","Type":"ContainerStarted","Data":"f209d51de6cdcdfd8bc2470fac709568266f1ae7fd2a54fa384cc7548743e8f6"} Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.363420 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"be8365e2-d3eb-4562-a24a-6dfb4342cfd5","Type":"ContainerStarted","Data":"1b5553abc16693a1d650dc621641d72cd5e1a848be8b19adf2ce85a5b22b486e"} Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.363705 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"be8365e2-d3eb-4562-a24a-6dfb4342cfd5","Type":"ContainerStarted","Data":"86ec5dee5ce7c929fcf71fb07ef83780d36dd9100c34181c206cc6f95f731984"} Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.366854 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"85d69681-4b38-42ca-9e60-93fb02d90e75","Type":"ContainerStarted","Data":"e5568fe858b6099004ebc93a25185402646cddbe39fea65e69ca9702dfc43e2c"} Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.366912 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"85d69681-4b38-42ca-9e60-93fb02d90e75","Type":"ContainerStarted","Data":"581284f3e32440d4f8119d8c558bb9c79c9698f1e7ba19c559a977af295ad60c"} Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.366940 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"85d69681-4b38-42ca-9e60-93fb02d90e75","Type":"ContainerStarted","Data":"09423cb04d78e920ef87e33102d91c433b584fc61c46cd005d21be1590673d1d"} Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.390515 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.390481124 podStartE2EDuration="4.390481124s" podCreationTimestamp="2025-12-09 04:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:41:59.389355963 +0000 UTC m=+5401.098661419" watchObservedRunningTime="2025-12-09 04:41:59.390481124 +0000 UTC m=+5401.099786590" Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.427311 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.427282618 podStartE2EDuration="4.427282618s" podCreationTimestamp="2025-12-09 04:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:41:59.413956158 +0000 UTC m=+5401.123261624" watchObservedRunningTime="2025-12-09 04:41:59.427282618 +0000 UTC m=+5401.136588084" Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.446877 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.446852458 podStartE2EDuration="4.446852458s" podCreationTimestamp="2025-12-09 04:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:41:59.44288553 +0000 UTC m=+5401.152190976" watchObservedRunningTime="2025-12-09 04:41:59.446852458 +0000 UTC m=+5401.156157924" Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.818456 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.838497 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Dec 09 04:41:59 crc kubenswrapper[4766]: I1209 04:41:59.846189 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 09 04:42:00 crc kubenswrapper[4766]: I1209 04:42:00.072286 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 09 04:42:00 crc kubenswrapper[4766]: I1209 04:42:00.082323 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 09 04:42:00 crc kubenswrapper[4766]: I1209 04:42:00.088346 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 09 04:42:00 crc kubenswrapper[4766]: I1209 04:42:00.140513 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Dec 09 04:42:00 crc kubenswrapper[4766]: I1209 
04:42:00.141092 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 09 04:42:00 crc kubenswrapper[4766]: I1209 04:42:00.377325 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 09 04:42:00 crc kubenswrapper[4766]: I1209 04:42:00.377420 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 09 04:42:01 crc kubenswrapper[4766]: I1209 04:42:01.818658 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 09 04:42:01 crc kubenswrapper[4766]: I1209 04:42:01.839096 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 09 04:42:01 crc kubenswrapper[4766]: I1209 04:42:01.846299 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.087633 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.108693 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.120451 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.297129 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b65bd4d69-mf49n"] Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.298681 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.302021 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.311172 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b65bd4d69-mf49n"] Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.378017 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r5w7\" (UniqueName: \"kubernetes.io/projected/e7734e2b-d6d0-4e81-a501-2fe7d2308701-kube-api-access-8r5w7\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.378137 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-dns-svc\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.378175 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-config\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.378225 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-ovsdbserver-sb\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " 
pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.480139 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-dns-svc\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.480265 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-config\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.480333 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-ovsdbserver-sb\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.480583 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r5w7\" (UniqueName: \"kubernetes.io/projected/e7734e2b-d6d0-4e81-a501-2fe7d2308701-kube-api-access-8r5w7\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.481017 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-dns-svc\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 
04:42:02.481123 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-config\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.481861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-ovsdbserver-sb\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.498984 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r5w7\" (UniqueName: \"kubernetes.io/projected/e7734e2b-d6d0-4e81-a501-2fe7d2308701-kube-api-access-8r5w7\") pod \"dnsmasq-dns-7b65bd4d69-mf49n\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.624523 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.862546 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.892953 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.896753 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 09 04:42:02 crc kubenswrapper[4766]: I1209 04:42:02.925847 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.075197 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b65bd4d69-mf49n"] Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.179111 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.244876 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b65bd4d69-mf49n"] Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.256145 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57866479cc-ggtfs"] Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.257503 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.259016 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.266845 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57866479cc-ggtfs"] Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.392338 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf5sw\" (UniqueName: \"kubernetes.io/projected/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-kube-api-access-rf5sw\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.392412 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-config\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.392527 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-nb\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.392771 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-dns-svc\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " 
pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.392848 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-sb\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.408177 4766 generic.go:334] "Generic (PLEG): container finished" podID="e7734e2b-d6d0-4e81-a501-2fe7d2308701" containerID="376b1a9aa8624a95fcd1e448ecac16238903defb4ee1f9a32f74f020ca367b0b" exitCode=0 Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.408445 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" event={"ID":"e7734e2b-d6d0-4e81-a501-2fe7d2308701","Type":"ContainerDied","Data":"376b1a9aa8624a95fcd1e448ecac16238903defb4ee1f9a32f74f020ca367b0b"} Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.408502 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" event={"ID":"e7734e2b-d6d0-4e81-a501-2fe7d2308701","Type":"ContainerStarted","Data":"d40bd555c31b725520b6b9355571a9b5af6e89c9f00bf380356f69370524f66f"} Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.455699 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.462798 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.466418 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.495140 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-dns-svc\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.495192 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-sb\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.495244 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf5sw\" (UniqueName: \"kubernetes.io/projected/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-kube-api-access-rf5sw\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.495284 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-config\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.495595 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-nb\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.496165 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-config\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.496190 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-sb\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.496301 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-nb\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.496804 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-dns-svc\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.527256 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf5sw\" (UniqueName: \"kubernetes.io/projected/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-kube-api-access-rf5sw\") pod \"dnsmasq-dns-57866479cc-ggtfs\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") " pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.616842 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.747938 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.800386 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-dns-svc\") pod \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.800513 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r5w7\" (UniqueName: \"kubernetes.io/projected/e7734e2b-d6d0-4e81-a501-2fe7d2308701-kube-api-access-8r5w7\") pod \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.800556 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-ovsdbserver-sb\") pod \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.800584 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-config\") pod \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\" (UID: \"e7734e2b-d6d0-4e81-a501-2fe7d2308701\") " Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.804577 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7734e2b-d6d0-4e81-a501-2fe7d2308701-kube-api-access-8r5w7" (OuterVolumeSpecName: "kube-api-access-8r5w7") pod "e7734e2b-d6d0-4e81-a501-2fe7d2308701" (UID: 
"e7734e2b-d6d0-4e81-a501-2fe7d2308701"). InnerVolumeSpecName "kube-api-access-8r5w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.828330 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7734e2b-d6d0-4e81-a501-2fe7d2308701" (UID: "e7734e2b-d6d0-4e81-a501-2fe7d2308701"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.844925 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-config" (OuterVolumeSpecName: "config") pod "e7734e2b-d6d0-4e81-a501-2fe7d2308701" (UID: "e7734e2b-d6d0-4e81-a501-2fe7d2308701"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.856103 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7734e2b-d6d0-4e81-a501-2fe7d2308701" (UID: "e7734e2b-d6d0-4e81-a501-2fe7d2308701"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.902169 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.902199 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r5w7\" (UniqueName: \"kubernetes.io/projected/e7734e2b-d6d0-4e81-a501-2fe7d2308701-kube-api-access-8r5w7\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.902226 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:03 crc kubenswrapper[4766]: I1209 04:42:03.902235 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7734e2b-d6d0-4e81-a501-2fe7d2308701-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.137038 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57866479cc-ggtfs"] Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.418811 4766 generic.go:334] "Generic (PLEG): container finished" podID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" containerID="152012238b5fe439699245bd8b5544421a1ef1a4a6a4024864ebc17bec546d19" exitCode=0 Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.418867 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" event={"ID":"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70","Type":"ContainerDied","Data":"152012238b5fe439699245bd8b5544421a1ef1a4a6a4024864ebc17bec546d19"} Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.419098 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57866479cc-ggtfs" event={"ID":"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70","Type":"ContainerStarted","Data":"7f7557e85ea5673c6647922f00369171fb2fb8a1cd4fe054ad83d165908f9865"} Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.423602 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.423620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b65bd4d69-mf49n" event={"ID":"e7734e2b-d6d0-4e81-a501-2fe7d2308701","Type":"ContainerDied","Data":"d40bd555c31b725520b6b9355571a9b5af6e89c9f00bf380356f69370524f66f"} Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.423686 4766 scope.go:117] "RemoveContainer" containerID="376b1a9aa8624a95fcd1e448ecac16238903defb4ee1f9a32f74f020ca367b0b" Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.520979 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b65bd4d69-mf49n"] Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.530490 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b65bd4d69-mf49n"] Dec 09 04:42:04 crc kubenswrapper[4766]: I1209 04:42:04.856706 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7734e2b-d6d0-4e81-a501-2fe7d2308701" path="/var/lib/kubelet/pods/e7734e2b-d6d0-4e81-a501-2fe7d2308701/volumes" Dec 09 04:42:05 crc kubenswrapper[4766]: I1209 04:42:05.439268 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" event={"ID":"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70","Type":"ContainerStarted","Data":"f5483d93e746ddd82f7d5f1f5469013f4385b0294c5e26426a5f8f9e42e61abe"} Dec 09 04:42:05 crc kubenswrapper[4766]: I1209 04:42:05.439525 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:05 crc kubenswrapper[4766]: I1209 
04:42:05.461307 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" podStartSLOduration=2.461287887 podStartE2EDuration="2.461287887s" podCreationTimestamp="2025-12-09 04:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:42:05.453774574 +0000 UTC m=+5407.163080010" watchObservedRunningTime="2025-12-09 04:42:05.461287887 +0000 UTC m=+5407.170593313" Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.852200 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 09 04:42:06 crc kubenswrapper[4766]: E1209 04:42:06.852825 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7734e2b-d6d0-4e81-a501-2fe7d2308701" containerName="init" Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.852841 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7734e2b-d6d0-4e81-a501-2fe7d2308701" containerName="init" Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.853055 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7734e2b-d6d0-4e81-a501-2fe7d2308701" containerName="init" Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.853790 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.860716 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.861520 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.950583 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4be9a001-60db-4cc6-a436-433efee77f20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.950667 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:06 crc kubenswrapper[4766]: I1209 04:42:06.950690 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l598\" (UniqueName: \"kubernetes.io/projected/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-kube-api-access-8l598\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.052011 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4be9a001-60db-4cc6-a436-433efee77f20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 
04:42:07.052125 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.052154 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l598\" (UniqueName: \"kubernetes.io/projected/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-kube-api-access-8l598\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.054987 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.055036 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4be9a001-60db-4cc6-a436-433efee77f20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3cf1101ba3275643135dc1c1e57721f34e63bdf229a5ceff99cfaa1c7e8d8642/globalmount\"" pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.059080 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.080436 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l598\" (UniqueName: 
\"kubernetes.io/projected/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-kube-api-access-8l598\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.101283 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4be9a001-60db-4cc6-a436-433efee77f20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20\") pod \"ovn-copy-data\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.179784 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.316157 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.316537 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:42:07 crc kubenswrapper[4766]: I1209 04:42:07.721156 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 04:42:07 crc kubenswrapper[4766]: W1209 04:42:07.733740 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b86ac1c_358a_4c50_9d62_f39f667a1d6d.slice/crio-87152dc48e1c1e89eca0b8c5cf9c634826066efc61c47ccf632b1ea6861fb1b6 WatchSource:0}: Error finding container 
87152dc48e1c1e89eca0b8c5cf9c634826066efc61c47ccf632b1ea6861fb1b6: Status 404 returned error can't find the container with id 87152dc48e1c1e89eca0b8c5cf9c634826066efc61c47ccf632b1ea6861fb1b6 Dec 09 04:42:08 crc kubenswrapper[4766]: I1209 04:42:08.469001 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"2b86ac1c-358a-4c50-9d62-f39f667a1d6d","Type":"ContainerStarted","Data":"1b7ab55a4c6f10c9e96036b4d7e4aec07a6a06709402a67aee4da01ca36b5fe3"} Dec 09 04:42:08 crc kubenswrapper[4766]: I1209 04:42:08.469395 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"2b86ac1c-358a-4c50-9d62-f39f667a1d6d","Type":"ContainerStarted","Data":"87152dc48e1c1e89eca0b8c5cf9c634826066efc61c47ccf632b1ea6861fb1b6"} Dec 09 04:42:08 crc kubenswrapper[4766]: I1209 04:42:08.489354 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.489336421 podStartE2EDuration="3.489336421s" podCreationTimestamp="2025-12-09 04:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:42:08.486817672 +0000 UTC m=+5410.196123138" watchObservedRunningTime="2025-12-09 04:42:08.489336421 +0000 UTC m=+5410.198641847" Dec 09 04:42:13 crc kubenswrapper[4766]: I1209 04:42:13.619500 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" Dec 09 04:42:13 crc kubenswrapper[4766]: I1209 04:42:13.697258 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-b69z7"] Dec 09 04:42:13 crc kubenswrapper[4766]: I1209 04:42:13.697633 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" podUID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" containerName="dnsmasq-dns" 
containerID="cri-o://8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e" gracePeriod=10 Dec 09 04:42:13 crc kubenswrapper[4766]: I1209 04:42:13.819575 4766 scope.go:117] "RemoveContainer" containerID="6f4237376e57135236fdb5193512b4d01f15a4ad09623d20ad3b43ca377c30ef" Dec 09 04:42:13 crc kubenswrapper[4766]: I1209 04:42:13.868460 4766 scope.go:117] "RemoveContainer" containerID="71dcca40cb34a1b59bf74581df36020017cb2f16db3ba2a071dbd92f9b3650a1" Dec 09 04:42:13 crc kubenswrapper[4766]: I1209 04:42:13.892804 4766 scope.go:117] "RemoveContainer" containerID="ba9fc45a426a1f3d0d8051b462d4187d6027d6ca366c47d41974ed4daac3040a" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.146531 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.308501 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-config\") pod \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.308666 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkj7k\" (UniqueName: \"kubernetes.io/projected/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-kube-api-access-tkj7k\") pod \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.308699 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-dns-svc\") pod \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\" (UID: \"1f0168a3-945b-49f7-9e2f-8ec5b7a66233\") " Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.313321 4766 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-kube-api-access-tkj7k" (OuterVolumeSpecName: "kube-api-access-tkj7k") pod "1f0168a3-945b-49f7-9e2f-8ec5b7a66233" (UID: "1f0168a3-945b-49f7-9e2f-8ec5b7a66233"). InnerVolumeSpecName "kube-api-access-tkj7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.345449 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-config" (OuterVolumeSpecName: "config") pod "1f0168a3-945b-49f7-9e2f-8ec5b7a66233" (UID: "1f0168a3-945b-49f7-9e2f-8ec5b7a66233"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.362538 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f0168a3-945b-49f7-9e2f-8ec5b7a66233" (UID: "1f0168a3-945b-49f7-9e2f-8ec5b7a66233"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.410663 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkj7k\" (UniqueName: \"kubernetes.io/projected/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-kube-api-access-tkj7k\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.410709 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.410721 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f0168a3-945b-49f7-9e2f-8ec5b7a66233-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.525657 4766 generic.go:334] "Generic (PLEG): container finished" podID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" containerID="8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e" exitCode=0 Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.525722 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" event={"ID":"1f0168a3-945b-49f7-9e2f-8ec5b7a66233","Type":"ContainerDied","Data":"8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e"} Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.525772 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" event={"ID":"1f0168a3-945b-49f7-9e2f-8ec5b7a66233","Type":"ContainerDied","Data":"b68707598dddffa506655c27aa41a327ce5075f208811452a8974fa1d6920dbc"} Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.525777 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-b69z7" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.525799 4766 scope.go:117] "RemoveContainer" containerID="8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.586406 4766 scope.go:117] "RemoveContainer" containerID="2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.588117 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-b69z7"] Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.612273 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-b69z7"] Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.639394 4766 scope.go:117] "RemoveContainer" containerID="8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e" Dec 09 04:42:14 crc kubenswrapper[4766]: E1209 04:42:14.643344 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e\": container with ID starting with 8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e not found: ID does not exist" containerID="8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.643374 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e"} err="failed to get container status \"8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e\": rpc error: code = NotFound desc = could not find container \"8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e\": container with ID starting with 8115a6f1c5fd9318ce7fcaee73fd2830b15cdb0a411c680a947f6bf48af6e98e not found: ID does not exist" Dec 09 
04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.643395 4766 scope.go:117] "RemoveContainer" containerID="2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c" Dec 09 04:42:14 crc kubenswrapper[4766]: E1209 04:42:14.647324 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c\": container with ID starting with 2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c not found: ID does not exist" containerID="2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.647365 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c"} err="failed to get container status \"2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c\": rpc error: code = NotFound desc = could not find container \"2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c\": container with ID starting with 2d50699b209fd6b50f7a237f4f2f94a55d63e29bdeae2065dace6416036e3e9c not found: ID does not exist" Dec 09 04:42:14 crc kubenswrapper[4766]: I1209 04:42:14.848201 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" path="/var/lib/kubelet/pods/1f0168a3-945b-49f7-9e2f-8ec5b7a66233/volumes" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.050205 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 09 04:42:15 crc kubenswrapper[4766]: E1209 04:42:15.050540 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" containerName="init" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.050552 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" containerName="init" 
Dec 09 04:42:15 crc kubenswrapper[4766]: E1209 04:42:15.050569 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" containerName="dnsmasq-dns" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.050575 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" containerName="dnsmasq-dns" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.050707 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0168a3-945b-49f7-9e2f-8ec5b7a66233" containerName="dnsmasq-dns" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.051450 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.056480 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.056833 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jz2xw" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.064280 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.069838 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.226085 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b809cd-dee1-42e4-a46a-8c7f9c067678-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.226133 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghd4t\" (UniqueName: 
\"kubernetes.io/projected/71b809cd-dee1-42e4-a46a-8c7f9c067678-kube-api-access-ghd4t\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.226240 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71b809cd-dee1-42e4-a46a-8c7f9c067678-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.226280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71b809cd-dee1-42e4-a46a-8c7f9c067678-scripts\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.226311 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b809cd-dee1-42e4-a46a-8c7f9c067678-config\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.328051 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b809cd-dee1-42e4-a46a-8c7f9c067678-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.328343 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghd4t\" (UniqueName: \"kubernetes.io/projected/71b809cd-dee1-42e4-a46a-8c7f9c067678-kube-api-access-ghd4t\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " 
pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.328518 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71b809cd-dee1-42e4-a46a-8c7f9c067678-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.328643 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71b809cd-dee1-42e4-a46a-8c7f9c067678-scripts\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.328777 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b809cd-dee1-42e4-a46a-8c7f9c067678-config\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.328929 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/71b809cd-dee1-42e4-a46a-8c7f9c067678-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.329434 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71b809cd-dee1-42e4-a46a-8c7f9c067678-scripts\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.329725 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b809cd-dee1-42e4-a46a-8c7f9c067678-config\") pod \"ovn-northd-0\" (UID: 
\"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.332359 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b809cd-dee1-42e4-a46a-8c7f9c067678-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.348656 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghd4t\" (UniqueName: \"kubernetes.io/projected/71b809cd-dee1-42e4-a46a-8c7f9c067678-kube-api-access-ghd4t\") pod \"ovn-northd-0\" (UID: \"71b809cd-dee1-42e4-a46a-8c7f9c067678\") " pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.417731 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 09 04:42:15 crc kubenswrapper[4766]: I1209 04:42:15.852509 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 09 04:42:16 crc kubenswrapper[4766]: I1209 04:42:16.544548 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"71b809cd-dee1-42e4-a46a-8c7f9c067678","Type":"ContainerStarted","Data":"bb3415f031a71242de6ab446a63577e519272500d940f3f39df06d41c1b3460e"} Dec 09 04:42:16 crc kubenswrapper[4766]: I1209 04:42:16.544942 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 09 04:42:16 crc kubenswrapper[4766]: I1209 04:42:16.544958 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"71b809cd-dee1-42e4-a46a-8c7f9c067678","Type":"ContainerStarted","Data":"825f1f6900203d59f86001bbc1d19dc928b1cbd3b4382eeabe860b1f60e4d76b"} Dec 09 04:42:16 crc kubenswrapper[4766]: I1209 04:42:16.544972 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"71b809cd-dee1-42e4-a46a-8c7f9c067678","Type":"ContainerStarted","Data":"7d4ee752fa2e06ccff0f922630c8299894409fea135a9f6bf9911ed1a1ed9ef8"} Dec 09 04:42:16 crc kubenswrapper[4766]: I1209 04:42:16.571235 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.5711965129999999 podStartE2EDuration="1.571196513s" podCreationTimestamp="2025-12-09 04:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:42:16.563335301 +0000 UTC m=+5418.272640727" watchObservedRunningTime="2025-12-09 04:42:16.571196513 +0000 UTC m=+5418.280501939" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.081351 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-s5krb"] Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.083241 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.093554 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s5krb"] Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.129481 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6190f627-a269-4c61-8cd6-da76dae23866-operator-scripts\") pod \"keystone-db-create-s5krb\" (UID: \"6190f627-a269-4c61-8cd6-da76dae23866\") " pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.129551 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p76t\" (UniqueName: \"kubernetes.io/projected/6190f627-a269-4c61-8cd6-da76dae23866-kube-api-access-8p76t\") pod \"keystone-db-create-s5krb\" (UID: \"6190f627-a269-4c61-8cd6-da76dae23866\") " pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.178232 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-564a-account-create-update-92zz4"] Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.179147 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.181427 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.187517 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-564a-account-create-update-92zz4"] Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.230885 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6190f627-a269-4c61-8cd6-da76dae23866-operator-scripts\") pod \"keystone-db-create-s5krb\" (UID: \"6190f627-a269-4c61-8cd6-da76dae23866\") " pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.230950 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p76t\" (UniqueName: \"kubernetes.io/projected/6190f627-a269-4c61-8cd6-da76dae23866-kube-api-access-8p76t\") pod \"keystone-db-create-s5krb\" (UID: \"6190f627-a269-4c61-8cd6-da76dae23866\") " pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.230971 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb7l9\" (UniqueName: \"kubernetes.io/projected/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-kube-api-access-tb7l9\") pod \"keystone-564a-account-create-update-92zz4\" (UID: \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\") " pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.231008 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-operator-scripts\") pod \"keystone-564a-account-create-update-92zz4\" (UID: 
\"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\") " pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.231644 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6190f627-a269-4c61-8cd6-da76dae23866-operator-scripts\") pod \"keystone-db-create-s5krb\" (UID: \"6190f627-a269-4c61-8cd6-da76dae23866\") " pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.251803 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p76t\" (UniqueName: \"kubernetes.io/projected/6190f627-a269-4c61-8cd6-da76dae23866-kube-api-access-8p76t\") pod \"keystone-db-create-s5krb\" (UID: \"6190f627-a269-4c61-8cd6-da76dae23866\") " pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.341777 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb7l9\" (UniqueName: \"kubernetes.io/projected/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-kube-api-access-tb7l9\") pod \"keystone-564a-account-create-update-92zz4\" (UID: \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\") " pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.341848 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-operator-scripts\") pod \"keystone-564a-account-create-update-92zz4\" (UID: \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\") " pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.342595 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-operator-scripts\") pod 
\"keystone-564a-account-create-update-92zz4\" (UID: \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\") " pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.364600 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb7l9\" (UniqueName: \"kubernetes.io/projected/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-kube-api-access-tb7l9\") pod \"keystone-564a-account-create-update-92zz4\" (UID: \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\") " pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.412415 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.498357 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.847879 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s5krb"] Dec 09 04:42:21 crc kubenswrapper[4766]: W1209 04:42:21.854572 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6190f627_a269_4c61_8cd6_da76dae23866.slice/crio-46419899640d462f9d81f378fc8cb1a9926b53d9226343ea8e39c0c3f864c0f3 WatchSource:0}: Error finding container 46419899640d462f9d81f378fc8cb1a9926b53d9226343ea8e39c0c3f864c0f3: Status 404 returned error can't find the container with id 46419899640d462f9d81f378fc8cb1a9926b53d9226343ea8e39c0c3f864c0f3 Dec 09 04:42:21 crc kubenswrapper[4766]: I1209 04:42:21.942292 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-564a-account-create-update-92zz4"] Dec 09 04:42:21 crc kubenswrapper[4766]: W1209 04:42:21.951283 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b1ebbf5_ae48_47ad_a185_ae686bcc2839.slice/crio-12bf988d4ad39993bccc05cf94376371d26a95623f0fafb1ff3cde827fafacbb WatchSource:0}: Error finding container 12bf988d4ad39993bccc05cf94376371d26a95623f0fafb1ff3cde827fafacbb: Status 404 returned error can't find the container with id 12bf988d4ad39993bccc05cf94376371d26a95623f0fafb1ff3cde827fafacbb Dec 09 04:42:22 crc kubenswrapper[4766]: I1209 04:42:22.607726 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b1ebbf5-ae48-47ad-a185-ae686bcc2839" containerID="1463142f4d27b9e80410c793f1fcd304c72c10e11410ad6b8705635932913395" exitCode=0 Dec 09 04:42:22 crc kubenswrapper[4766]: I1209 04:42:22.607804 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-564a-account-create-update-92zz4" event={"ID":"1b1ebbf5-ae48-47ad-a185-ae686bcc2839","Type":"ContainerDied","Data":"1463142f4d27b9e80410c793f1fcd304c72c10e11410ad6b8705635932913395"} Dec 09 04:42:22 crc kubenswrapper[4766]: I1209 04:42:22.608192 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-564a-account-create-update-92zz4" event={"ID":"1b1ebbf5-ae48-47ad-a185-ae686bcc2839","Type":"ContainerStarted","Data":"12bf988d4ad39993bccc05cf94376371d26a95623f0fafb1ff3cde827fafacbb"} Dec 09 04:42:22 crc kubenswrapper[4766]: I1209 04:42:22.614184 4766 generic.go:334] "Generic (PLEG): container finished" podID="6190f627-a269-4c61-8cd6-da76dae23866" containerID="87d67a8d055c3618a15471b1f04db4aaf95d84725ebb64534b3d9a95061cf88d" exitCode=0 Dec 09 04:42:22 crc kubenswrapper[4766]: I1209 04:42:22.614299 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s5krb" event={"ID":"6190f627-a269-4c61-8cd6-da76dae23866","Type":"ContainerDied","Data":"87d67a8d055c3618a15471b1f04db4aaf95d84725ebb64534b3d9a95061cf88d"} Dec 09 04:42:22 crc kubenswrapper[4766]: I1209 04:42:22.614594 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-db-create-s5krb" event={"ID":"6190f627-a269-4c61-8cd6-da76dae23866","Type":"ContainerStarted","Data":"46419899640d462f9d81f378fc8cb1a9926b53d9226343ea8e39c0c3f864c0f3"} Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.131021 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.142907 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.213939 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb7l9\" (UniqueName: \"kubernetes.io/projected/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-kube-api-access-tb7l9\") pod \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\" (UID: \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\") " Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.214002 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p76t\" (UniqueName: \"kubernetes.io/projected/6190f627-a269-4c61-8cd6-da76dae23866-kube-api-access-8p76t\") pod \"6190f627-a269-4c61-8cd6-da76dae23866\" (UID: \"6190f627-a269-4c61-8cd6-da76dae23866\") " Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.214025 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-operator-scripts\") pod \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\" (UID: \"1b1ebbf5-ae48-47ad-a185-ae686bcc2839\") " Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.214174 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6190f627-a269-4c61-8cd6-da76dae23866-operator-scripts\") pod \"6190f627-a269-4c61-8cd6-da76dae23866\" (UID: 
\"6190f627-a269-4c61-8cd6-da76dae23866\") " Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.214796 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b1ebbf5-ae48-47ad-a185-ae686bcc2839" (UID: "1b1ebbf5-ae48-47ad-a185-ae686bcc2839"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.214818 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6190f627-a269-4c61-8cd6-da76dae23866-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6190f627-a269-4c61-8cd6-da76dae23866" (UID: "6190f627-a269-4c61-8cd6-da76dae23866"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.215143 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6190f627-a269-4c61-8cd6-da76dae23866-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.215164 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.218842 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-kube-api-access-tb7l9" (OuterVolumeSpecName: "kube-api-access-tb7l9") pod "1b1ebbf5-ae48-47ad-a185-ae686bcc2839" (UID: "1b1ebbf5-ae48-47ad-a185-ae686bcc2839"). InnerVolumeSpecName "kube-api-access-tb7l9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.220122 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6190f627-a269-4c61-8cd6-da76dae23866-kube-api-access-8p76t" (OuterVolumeSpecName: "kube-api-access-8p76t") pod "6190f627-a269-4c61-8cd6-da76dae23866" (UID: "6190f627-a269-4c61-8cd6-da76dae23866"). InnerVolumeSpecName "kube-api-access-8p76t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.316408 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb7l9\" (UniqueName: \"kubernetes.io/projected/1b1ebbf5-ae48-47ad-a185-ae686bcc2839-kube-api-access-tb7l9\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.316457 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p76t\" (UniqueName: \"kubernetes.io/projected/6190f627-a269-4c61-8cd6-da76dae23866-kube-api-access-8p76t\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.639519 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-564a-account-create-update-92zz4" event={"ID":"1b1ebbf5-ae48-47ad-a185-ae686bcc2839","Type":"ContainerDied","Data":"12bf988d4ad39993bccc05cf94376371d26a95623f0fafb1ff3cde827fafacbb"} Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.639572 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-564a-account-create-update-92zz4" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.639599 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12bf988d4ad39993bccc05cf94376371d26a95623f0fafb1ff3cde827fafacbb" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.641942 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s5krb" event={"ID":"6190f627-a269-4c61-8cd6-da76dae23866","Type":"ContainerDied","Data":"46419899640d462f9d81f378fc8cb1a9926b53d9226343ea8e39c0c3f864c0f3"} Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.642000 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46419899640d462f9d81f378fc8cb1a9926b53d9226343ea8e39c0c3f864c0f3" Dec 09 04:42:24 crc kubenswrapper[4766]: I1209 04:42:24.641999 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s5krb" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.700339 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jnf67"] Dec 09 04:42:26 crc kubenswrapper[4766]: E1209 04:42:26.701015 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6190f627-a269-4c61-8cd6-da76dae23866" containerName="mariadb-database-create" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.701031 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6190f627-a269-4c61-8cd6-da76dae23866" containerName="mariadb-database-create" Dec 09 04:42:26 crc kubenswrapper[4766]: E1209 04:42:26.701056 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1ebbf5-ae48-47ad-a185-ae686bcc2839" containerName="mariadb-account-create-update" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.701064 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1ebbf5-ae48-47ad-a185-ae686bcc2839" 
containerName="mariadb-account-create-update" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.701280 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1ebbf5-ae48-47ad-a185-ae686bcc2839" containerName="mariadb-account-create-update" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.701309 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6190f627-a269-4c61-8cd6-da76dae23866" containerName="mariadb-database-create" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.701945 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.703646 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.704546 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.704641 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.705262 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mr7dc" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.710088 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jnf67"] Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.755002 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-config-data\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.755079 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-combined-ca-bundle\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.755130 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm98x\" (UniqueName: \"kubernetes.io/projected/68479e12-0d4b-48da-95d0-df49e38051e9-kube-api-access-hm98x\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.856628 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-combined-ca-bundle\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.856752 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm98x\" (UniqueName: \"kubernetes.io/projected/68479e12-0d4b-48da-95d0-df49e38051e9-kube-api-access-hm98x\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.856868 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-config-data\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.863503 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-config-data\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.863907 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-combined-ca-bundle\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:26 crc kubenswrapper[4766]: I1209 04:42:26.883373 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm98x\" (UniqueName: \"kubernetes.io/projected/68479e12-0d4b-48da-95d0-df49e38051e9-kube-api-access-hm98x\") pod \"keystone-db-sync-jnf67\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:27 crc kubenswrapper[4766]: I1209 04:42:27.018883 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:27 crc kubenswrapper[4766]: I1209 04:42:27.453240 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jnf67"] Dec 09 04:42:27 crc kubenswrapper[4766]: I1209 04:42:27.666300 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jnf67" event={"ID":"68479e12-0d4b-48da-95d0-df49e38051e9","Type":"ContainerStarted","Data":"11911ded66752f771d6ad4fa421c3258a74bdbf1afcb8da093e75a37d9112b17"} Dec 09 04:42:27 crc kubenswrapper[4766]: I1209 04:42:27.666342 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jnf67" event={"ID":"68479e12-0d4b-48da-95d0-df49e38051e9","Type":"ContainerStarted","Data":"93c7fa64ec9af5944b795309a6ba2955e4f028c993a11e5bae9fc90106e0d5e8"} Dec 09 04:42:27 crc kubenswrapper[4766]: I1209 04:42:27.693150 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jnf67" podStartSLOduration=1.693123494 podStartE2EDuration="1.693123494s" podCreationTimestamp="2025-12-09 04:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:42:27.681679945 +0000 UTC m=+5429.390985411" watchObservedRunningTime="2025-12-09 04:42:27.693123494 +0000 UTC m=+5429.402428950" Dec 09 04:42:29 crc kubenswrapper[4766]: I1209 04:42:29.683994 4766 generic.go:334] "Generic (PLEG): container finished" podID="68479e12-0d4b-48da-95d0-df49e38051e9" containerID="11911ded66752f771d6ad4fa421c3258a74bdbf1afcb8da093e75a37d9112b17" exitCode=0 Dec 09 04:42:29 crc kubenswrapper[4766]: I1209 04:42:29.684090 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jnf67" event={"ID":"68479e12-0d4b-48da-95d0-df49e38051e9","Type":"ContainerDied","Data":"11911ded66752f771d6ad4fa421c3258a74bdbf1afcb8da093e75a37d9112b17"} Dec 09 04:42:30 crc 
kubenswrapper[4766]: I1209 04:42:30.491286 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 09 04:42:30 crc kubenswrapper[4766]: I1209 04:42:30.974793 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.124942 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-config-data\") pod \"68479e12-0d4b-48da-95d0-df49e38051e9\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.125058 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm98x\" (UniqueName: \"kubernetes.io/projected/68479e12-0d4b-48da-95d0-df49e38051e9-kube-api-access-hm98x\") pod \"68479e12-0d4b-48da-95d0-df49e38051e9\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.125114 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-combined-ca-bundle\") pod \"68479e12-0d4b-48da-95d0-df49e38051e9\" (UID: \"68479e12-0d4b-48da-95d0-df49e38051e9\") " Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.130828 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68479e12-0d4b-48da-95d0-df49e38051e9-kube-api-access-hm98x" (OuterVolumeSpecName: "kube-api-access-hm98x") pod "68479e12-0d4b-48da-95d0-df49e38051e9" (UID: "68479e12-0d4b-48da-95d0-df49e38051e9"). InnerVolumeSpecName "kube-api-access-hm98x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.149101 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68479e12-0d4b-48da-95d0-df49e38051e9" (UID: "68479e12-0d4b-48da-95d0-df49e38051e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.163886 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-config-data" (OuterVolumeSpecName: "config-data") pod "68479e12-0d4b-48da-95d0-df49e38051e9" (UID: "68479e12-0d4b-48da-95d0-df49e38051e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.227355 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.227387 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68479e12-0d4b-48da-95d0-df49e38051e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.227396 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm98x\" (UniqueName: \"kubernetes.io/projected/68479e12-0d4b-48da-95d0-df49e38051e9-kube-api-access-hm98x\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.705717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jnf67" 
event={"ID":"68479e12-0d4b-48da-95d0-df49e38051e9","Type":"ContainerDied","Data":"93c7fa64ec9af5944b795309a6ba2955e4f028c993a11e5bae9fc90106e0d5e8"} Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.706049 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93c7fa64ec9af5944b795309a6ba2955e4f028c993a11e5bae9fc90106e0d5e8" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.705798 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jnf67" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.954889 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-575fbf448f-f4bvn"] Dec 09 04:42:31 crc kubenswrapper[4766]: E1209 04:42:31.957474 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68479e12-0d4b-48da-95d0-df49e38051e9" containerName="keystone-db-sync" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.957503 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="68479e12-0d4b-48da-95d0-df49e38051e9" containerName="keystone-db-sync" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.957872 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="68479e12-0d4b-48da-95d0-df49e38051e9" containerName="keystone-db-sync" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.961669 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:31 crc kubenswrapper[4766]: I1209 04:42:31.967360 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-575fbf448f-f4bvn"] Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.001466 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pgdtr"] Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.002596 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.004554 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.004875 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mr7dc" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.005017 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.005136 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.007192 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.021955 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pgdtr"] Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151095 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4qp\" (UniqueName: \"kubernetes.io/projected/a78dfbbe-5dbc-49a4-ab73-e86379df691f-kube-api-access-jh4qp\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-dns-svc\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151223 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-sb\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151286 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-credential-keys\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151314 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-nb\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151339 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-fernet-keys\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151362 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwq8t\" (UniqueName: \"kubernetes.io/projected/f63e3801-5ca0-48dd-9125-dded30ae25d9-kube-api-access-hwq8t\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151425 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-config\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151455 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-scripts\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151476 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-config-data\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.151513 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-combined-ca-bundle\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.253919 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh4qp\" (UniqueName: \"kubernetes.io/projected/a78dfbbe-5dbc-49a4-ab73-e86379df691f-kube-api-access-jh4qp\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254023 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-dns-svc\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254046 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-sb\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254090 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-credential-keys\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254117 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-nb\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-fernet-keys\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254167 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwq8t\" (UniqueName: 
\"kubernetes.io/projected/f63e3801-5ca0-48dd-9125-dded30ae25d9-kube-api-access-hwq8t\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254244 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-config\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254272 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-scripts\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254295 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-config-data\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.254318 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-combined-ca-bundle\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.255031 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-nb\") pod 
\"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.255165 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-sb\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.255882 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-dns-svc\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.256260 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-config\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.261116 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-scripts\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.261182 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-fernet-keys\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr" Dec 09 04:42:32 crc 
kubenswrapper[4766]: I1209 04:42:32.261310 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-config-data\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr"
Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.262866 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-credential-keys\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr"
Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.262999 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-combined-ca-bundle\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr"
Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.273353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwq8t\" (UniqueName: \"kubernetes.io/projected/f63e3801-5ca0-48dd-9125-dded30ae25d9-kube-api-access-hwq8t\") pod \"dnsmasq-dns-575fbf448f-f4bvn\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " pod="openstack/dnsmasq-dns-575fbf448f-f4bvn"
Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.274353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4qp\" (UniqueName: \"kubernetes.io/projected/a78dfbbe-5dbc-49a4-ab73-e86379df691f-kube-api-access-jh4qp\") pod \"keystone-bootstrap-pgdtr\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") " pod="openstack/keystone-bootstrap-pgdtr"
Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.285807 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn"
Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.320386 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pgdtr"
Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.824432 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-575fbf448f-f4bvn"]
Dec 09 04:42:32 crc kubenswrapper[4766]: W1209 04:42:32.827904 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf63e3801_5ca0_48dd_9125_dded30ae25d9.slice/crio-19497619c6761f0e69093be05d217cef62f08d4d529aa04197b0439cc905f819 WatchSource:0}: Error finding container 19497619c6761f0e69093be05d217cef62f08d4d529aa04197b0439cc905f819: Status 404 returned error can't find the container with id 19497619c6761f0e69093be05d217cef62f08d4d529aa04197b0439cc905f819
Dec 09 04:42:32 crc kubenswrapper[4766]: I1209 04:42:32.916301 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pgdtr"]
Dec 09 04:42:32 crc kubenswrapper[4766]: W1209 04:42:32.924600 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda78dfbbe_5dbc_49a4_ab73_e86379df691f.slice/crio-4db09943763e9d45b922863db0c655652108186b22cdacc1a9e5b51bcd190b94 WatchSource:0}: Error finding container 4db09943763e9d45b922863db0c655652108186b22cdacc1a9e5b51bcd190b94: Status 404 returned error can't find the container with id 4db09943763e9d45b922863db0c655652108186b22cdacc1a9e5b51bcd190b94
Dec 09 04:42:33 crc kubenswrapper[4766]: I1209 04:42:33.735467 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" event={"ID":"f63e3801-5ca0-48dd-9125-dded30ae25d9","Type":"ContainerDied","Data":"ea22423636339ab98b985864e440da4ad1c97150b5be5d864a72bd3360533a62"}
Dec 09 04:42:33 crc kubenswrapper[4766]: I1209 04:42:33.735479 4766 generic.go:334] "Generic (PLEG): container finished" podID="f63e3801-5ca0-48dd-9125-dded30ae25d9" containerID="ea22423636339ab98b985864e440da4ad1c97150b5be5d864a72bd3360533a62" exitCode=0
Dec 09 04:42:33 crc kubenswrapper[4766]: I1209 04:42:33.735967 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" event={"ID":"f63e3801-5ca0-48dd-9125-dded30ae25d9","Type":"ContainerStarted","Data":"19497619c6761f0e69093be05d217cef62f08d4d529aa04197b0439cc905f819"}
Dec 09 04:42:33 crc kubenswrapper[4766]: I1209 04:42:33.739842 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pgdtr" event={"ID":"a78dfbbe-5dbc-49a4-ab73-e86379df691f","Type":"ContainerStarted","Data":"9fa5b938b00031f6434c986a3e0f3db01ac47d9a1e2e840a7ec7f74c073db07c"}
Dec 09 04:42:33 crc kubenswrapper[4766]: I1209 04:42:33.739910 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pgdtr" event={"ID":"a78dfbbe-5dbc-49a4-ab73-e86379df691f","Type":"ContainerStarted","Data":"4db09943763e9d45b922863db0c655652108186b22cdacc1a9e5b51bcd190b94"}
Dec 09 04:42:33 crc kubenswrapper[4766]: I1209 04:42:33.792157 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pgdtr" podStartSLOduration=2.79213746 podStartE2EDuration="2.79213746s" podCreationTimestamp="2025-12-09 04:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:42:33.788601595 +0000 UTC m=+5435.497907031" watchObservedRunningTime="2025-12-09 04:42:33.79213746 +0000 UTC m=+5435.501442886"
Dec 09 04:42:34 crc kubenswrapper[4766]: I1209 04:42:34.757613 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" event={"ID":"f63e3801-5ca0-48dd-9125-dded30ae25d9","Type":"ContainerStarted","Data":"93c07e6ecea6847f1bd0d9e1783b135bb19ca1a993f27aa8a91ba0121e4714cf"}
Dec 09 04:42:34 crc kubenswrapper[4766]: I1209 04:42:34.758074 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn"
Dec 09 04:42:34 crc kubenswrapper[4766]: I1209 04:42:34.781820 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" podStartSLOduration=3.781799885 podStartE2EDuration="3.781799885s" podCreationTimestamp="2025-12-09 04:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:42:34.781749354 +0000 UTC m=+5436.491054810" watchObservedRunningTime="2025-12-09 04:42:34.781799885 +0000 UTC m=+5436.491105321"
Dec 09 04:42:36 crc kubenswrapper[4766]: I1209 04:42:36.777301 4766 generic.go:334] "Generic (PLEG): container finished" podID="a78dfbbe-5dbc-49a4-ab73-e86379df691f" containerID="9fa5b938b00031f6434c986a3e0f3db01ac47d9a1e2e840a7ec7f74c073db07c" exitCode=0
Dec 09 04:42:36 crc kubenswrapper[4766]: I1209 04:42:36.777400 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pgdtr" event={"ID":"a78dfbbe-5dbc-49a4-ab73-e86379df691f","Type":"ContainerDied","Data":"9fa5b938b00031f6434c986a3e0f3db01ac47d9a1e2e840a7ec7f74c073db07c"}
Dec 09 04:42:37 crc kubenswrapper[4766]: I1209 04:42:37.316629 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 04:42:37 crc kubenswrapper[4766]: I1209 04:42:37.316690 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.206685 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pgdtr"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.396125 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-config-data\") pod \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") "
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.396246 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-combined-ca-bundle\") pod \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") "
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.396352 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-fernet-keys\") pod \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") "
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.396383 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-scripts\") pod \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") "
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.396451 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-credential-keys\") pod \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") "
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.396494 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh4qp\" (UniqueName: \"kubernetes.io/projected/a78dfbbe-5dbc-49a4-ab73-e86379df691f-kube-api-access-jh4qp\") pod \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\" (UID: \"a78dfbbe-5dbc-49a4-ab73-e86379df691f\") "
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.402733 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-scripts" (OuterVolumeSpecName: "scripts") pod "a78dfbbe-5dbc-49a4-ab73-e86379df691f" (UID: "a78dfbbe-5dbc-49a4-ab73-e86379df691f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.403953 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a78dfbbe-5dbc-49a4-ab73-e86379df691f" (UID: "a78dfbbe-5dbc-49a4-ab73-e86379df691f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.404785 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a78dfbbe-5dbc-49a4-ab73-e86379df691f" (UID: "a78dfbbe-5dbc-49a4-ab73-e86379df691f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.405026 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78dfbbe-5dbc-49a4-ab73-e86379df691f-kube-api-access-jh4qp" (OuterVolumeSpecName: "kube-api-access-jh4qp") pod "a78dfbbe-5dbc-49a4-ab73-e86379df691f" (UID: "a78dfbbe-5dbc-49a4-ab73-e86379df691f"). InnerVolumeSpecName "kube-api-access-jh4qp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.429011 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a78dfbbe-5dbc-49a4-ab73-e86379df691f" (UID: "a78dfbbe-5dbc-49a4-ab73-e86379df691f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.436526 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-config-data" (OuterVolumeSpecName: "config-data") pod "a78dfbbe-5dbc-49a4-ab73-e86379df691f" (UID: "a78dfbbe-5dbc-49a4-ab73-e86379df691f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.498500 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.498582 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.498604 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.498622 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.498642 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78dfbbe-5dbc-49a4-ab73-e86379df691f-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.498658 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh4qp\" (UniqueName: \"kubernetes.io/projected/a78dfbbe-5dbc-49a4-ab73-e86379df691f-kube-api-access-jh4qp\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.796601 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pgdtr" event={"ID":"a78dfbbe-5dbc-49a4-ab73-e86379df691f","Type":"ContainerDied","Data":"4db09943763e9d45b922863db0c655652108186b22cdacc1a9e5b51bcd190b94"}
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.796985 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db09943763e9d45b922863db0c655652108186b22cdacc1a9e5b51bcd190b94"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.796676 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pgdtr"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.904596 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pgdtr"]
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.913070 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pgdtr"]
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.948828 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-496cq"]
Dec 09 04:42:38 crc kubenswrapper[4766]: E1209 04:42:38.949264 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78dfbbe-5dbc-49a4-ab73-e86379df691f" containerName="keystone-bootstrap"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.949287 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78dfbbe-5dbc-49a4-ab73-e86379df691f" containerName="keystone-bootstrap"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.949619 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78dfbbe-5dbc-49a4-ab73-e86379df691f" containerName="keystone-bootstrap"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.950424 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.954091 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.954375 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.954683 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.956463 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.957108 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mr7dc"
Dec 09 04:42:38 crc kubenswrapper[4766]: I1209 04:42:38.961688 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-496cq"]
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.110381 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-scripts\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.110609 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-config-data\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.110661 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-combined-ca-bundle\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.110735 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpcdk\" (UniqueName: \"kubernetes.io/projected/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-kube-api-access-xpcdk\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.110805 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-credential-keys\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.110873 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-fernet-keys\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.212565 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-scripts\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.212701 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-config-data\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.212726 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-combined-ca-bundle\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.212743 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpcdk\" (UniqueName: \"kubernetes.io/projected/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-kube-api-access-xpcdk\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.212769 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-credential-keys\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.212788 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-fernet-keys\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.218753 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-scripts\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.219785 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-fernet-keys\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.219799 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-config-data\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.225800 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-credential-keys\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.226803 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-combined-ca-bundle\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.239951 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpcdk\" (UniqueName: \"kubernetes.io/projected/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-kube-api-access-xpcdk\") pod \"keystone-bootstrap-496cq\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") " pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.305297 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.724012 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-496cq"]
Dec 09 04:42:39 crc kubenswrapper[4766]: I1209 04:42:39.808725 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-496cq" event={"ID":"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6","Type":"ContainerStarted","Data":"89aba0e0a9db63ee52a352e149611dddf5fd5234497e0df1bd3a15969209fd91"}
Dec 09 04:42:40 crc kubenswrapper[4766]: I1209 04:42:40.822022 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-496cq" event={"ID":"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6","Type":"ContainerStarted","Data":"4727821be4240c408efe814f03824c0895de2af772d2a3249fb54d395992ea1a"}
Dec 09 04:42:40 crc kubenswrapper[4766]: I1209 04:42:40.851628 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-496cq" podStartSLOduration=2.851580981 podStartE2EDuration="2.851580981s" podCreationTimestamp="2025-12-09 04:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:42:40.843330858 +0000 UTC m=+5442.552636344" watchObservedRunningTime="2025-12-09 04:42:40.851580981 +0000 UTC m=+5442.560886417"
Dec 09 04:42:40 crc kubenswrapper[4766]: I1209 04:42:40.860391 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78dfbbe-5dbc-49a4-ab73-e86379df691f" path="/var/lib/kubelet/pods/a78dfbbe-5dbc-49a4-ab73-e86379df691f/volumes"
Dec 09 04:42:42 crc kubenswrapper[4766]: I1209 04:42:42.288479 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn"
Dec 09 04:42:42 crc kubenswrapper[4766]: I1209 04:42:42.355171 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57866479cc-ggtfs"]
Dec 09 04:42:42 crc kubenswrapper[4766]: I1209 04:42:42.355402 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" podUID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" containerName="dnsmasq-dns" containerID="cri-o://f5483d93e746ddd82f7d5f1f5469013f4385b0294c5e26426a5f8f9e42e61abe" gracePeriod=10
Dec 09 04:42:42 crc kubenswrapper[4766]: I1209 04:42:42.894418 4766 generic.go:334] "Generic (PLEG): container finished" podID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" containerID="f5483d93e746ddd82f7d5f1f5469013f4385b0294c5e26426a5f8f9e42e61abe" exitCode=0
Dec 09 04:42:42 crc kubenswrapper[4766]: I1209 04:42:42.894503 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" event={"ID":"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70","Type":"ContainerDied","Data":"f5483d93e746ddd82f7d5f1f5469013f4385b0294c5e26426a5f8f9e42e61abe"}
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.309658 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57866479cc-ggtfs"
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.405764 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-config\") pod \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") "
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.405868 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf5sw\" (UniqueName: \"kubernetes.io/projected/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-kube-api-access-rf5sw\") pod \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") "
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.405951 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-dns-svc\") pod \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") "
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.406953 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-sb\") pod \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") "
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.407043 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-nb\") pod \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\" (UID: \"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70\") "
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.411771 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-kube-api-access-rf5sw" (OuterVolumeSpecName: "kube-api-access-rf5sw") pod "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" (UID: "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70"). InnerVolumeSpecName "kube-api-access-rf5sw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.446176 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" (UID: "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.449909 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" (UID: "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.464969 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" (UID: "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.465691 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-config" (OuterVolumeSpecName: "config") pod "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" (UID: "ebe97171-cf11-423a-a1a0-5fd7ef6cbf70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.509096 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.509127 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-config\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.509138 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf5sw\" (UniqueName: \"kubernetes.io/projected/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-kube-api-access-rf5sw\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.509147 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.509155 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.910248 4766 generic.go:334] "Generic (PLEG): container finished" podID="403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" containerID="4727821be4240c408efe814f03824c0895de2af772d2a3249fb54d395992ea1a" exitCode=0
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.910345 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-496cq" event={"ID":"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6","Type":"ContainerDied","Data":"4727821be4240c408efe814f03824c0895de2af772d2a3249fb54d395992ea1a"}
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.914137 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57866479cc-ggtfs" event={"ID":"ebe97171-cf11-423a-a1a0-5fd7ef6cbf70","Type":"ContainerDied","Data":"7f7557e85ea5673c6647922f00369171fb2fb8a1cd4fe054ad83d165908f9865"}
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.914200 4766 scope.go:117] "RemoveContainer" containerID="f5483d93e746ddd82f7d5f1f5469013f4385b0294c5e26426a5f8f9e42e61abe"
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.914256 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57866479cc-ggtfs"
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.968180 4766 scope.go:117] "RemoveContainer" containerID="152012238b5fe439699245bd8b5544421a1ef1a4a6a4024864ebc17bec546d19"
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.972643 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57866479cc-ggtfs"]
Dec 09 04:42:43 crc kubenswrapper[4766]: I1209 04:42:43.981636 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57866479cc-ggtfs"]
Dec 09 04:42:44 crc kubenswrapper[4766]: I1209 04:42:44.853401 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" path="/var/lib/kubelet/pods/ebe97171-cf11-423a-a1a0-5fd7ef6cbf70/volumes"
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.277243 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-496cq"
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.356121 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpcdk\" (UniqueName: \"kubernetes.io/projected/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-kube-api-access-xpcdk\") pod \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") "
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.356416 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-config-data\") pod \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") "
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.356493 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-combined-ca-bundle\") pod \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") "
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.356522 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-fernet-keys\") pod \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") "
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.356565 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-scripts\") pod \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") "
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.356606 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-credential-keys\") pod \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\" (UID: \"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6\") "
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.360916 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-scripts" (OuterVolumeSpecName: "scripts") pod "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" (UID: "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.361100 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-kube-api-access-xpcdk" (OuterVolumeSpecName: "kube-api-access-xpcdk") pod "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" (UID: "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6"). InnerVolumeSpecName "kube-api-access-xpcdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.361436 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" (UID: "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.361916 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" (UID: "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.378646 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-config-data" (OuterVolumeSpecName: "config-data") pod "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" (UID: "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.381341 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" (UID: "403f07a0-cdb0-4cf1-8d04-9555b8e82fb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.458986 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.459043 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.459059 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.459070 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:42:45 crc
kubenswrapper[4766]: I1209 04:42:45.459080 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.459090 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpcdk\" (UniqueName: \"kubernetes.io/projected/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6-kube-api-access-xpcdk\") on node \"crc\" DevicePath \"\"" Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.938945 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-496cq" event={"ID":"403f07a0-cdb0-4cf1-8d04-9555b8e82fb6","Type":"ContainerDied","Data":"89aba0e0a9db63ee52a352e149611dddf5fd5234497e0df1bd3a15969209fd91"} Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.938993 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89aba0e0a9db63ee52a352e149611dddf5fd5234497e0df1bd3a15969209fd91" Dec 09 04:42:45 crc kubenswrapper[4766]: I1209 04:42:45.939040 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-496cq" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.075810 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8b8987d78-zlv6w"] Dec 09 04:42:46 crc kubenswrapper[4766]: E1209 04:42:46.076950 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" containerName="dnsmasq-dns" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.077001 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" containerName="dnsmasq-dns" Dec 09 04:42:46 crc kubenswrapper[4766]: E1209 04:42:46.077027 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" containerName="keystone-bootstrap" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.077044 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" containerName="keystone-bootstrap" Dec 09 04:42:46 crc kubenswrapper[4766]: E1209 04:42:46.077070 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" containerName="init" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.077086 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" containerName="init" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.077489 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe97171-cf11-423a-a1a0-5fd7ef6cbf70" containerName="dnsmasq-dns" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.077544 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" containerName="keystone-bootstrap" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.078803 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.088471 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.088558 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.088601 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.088771 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mr7dc" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.094566 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8b8987d78-zlv6w"] Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.174183 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-config-data\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.174256 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-credential-keys\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.174280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljqpn\" (UniqueName: \"kubernetes.io/projected/c4c26389-5cc4-421f-b5f4-6135179a1ef7-kube-api-access-ljqpn\") pod \"keystone-8b8987d78-zlv6w\" 
(UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.174361 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-combined-ca-bundle\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.174440 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-fernet-keys\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.174472 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-scripts\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.275727 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-combined-ca-bundle\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.275827 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-fernet-keys\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " 
pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.275867 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-scripts\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.275908 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-config-data\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.275924 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-credential-keys\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.275942 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljqpn\" (UniqueName: \"kubernetes.io/projected/c4c26389-5cc4-421f-b5f4-6135179a1ef7-kube-api-access-ljqpn\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.280373 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-scripts\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.281122 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-fernet-keys\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.283832 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-config-data\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.284827 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-credential-keys\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.289034 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4c26389-5cc4-421f-b5f4-6135179a1ef7-combined-ca-bundle\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.298609 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljqpn\" (UniqueName: \"kubernetes.io/projected/c4c26389-5cc4-421f-b5f4-6135179a1ef7-kube-api-access-ljqpn\") pod \"keystone-8b8987d78-zlv6w\" (UID: \"c4c26389-5cc4-421f-b5f4-6135179a1ef7\") " pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.462025 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.892314 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8b8987d78-zlv6w"] Dec 09 04:42:46 crc kubenswrapper[4766]: I1209 04:42:46.952970 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8b8987d78-zlv6w" event={"ID":"c4c26389-5cc4-421f-b5f4-6135179a1ef7","Type":"ContainerStarted","Data":"74c4caa9e4fcf82e70accf5204d2b09592585d51782c41d2eeddd6a49615d2fe"} Dec 09 04:42:47 crc kubenswrapper[4766]: I1209 04:42:47.961731 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8b8987d78-zlv6w" event={"ID":"c4c26389-5cc4-421f-b5f4-6135179a1ef7","Type":"ContainerStarted","Data":"a2819916ef21a2955e3750639d9d3afc5b2c69d1357e4473fabe2432c27abaa8"} Dec 09 04:42:47 crc kubenswrapper[4766]: I1209 04:42:47.962638 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:42:47 crc kubenswrapper[4766]: I1209 04:42:47.993182 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8b8987d78-zlv6w" podStartSLOduration=1.993154594 podStartE2EDuration="1.993154594s" podCreationTimestamp="2025-12-09 04:42:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:42:47.982112885 +0000 UTC m=+5449.691418311" watchObservedRunningTime="2025-12-09 04:42:47.993154594 +0000 UTC m=+5449.702460040" Dec 09 04:43:07 crc kubenswrapper[4766]: I1209 04:43:07.316932 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:43:07 crc kubenswrapper[4766]: I1209 
04:43:07.317538 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:43:07 crc kubenswrapper[4766]: I1209 04:43:07.317591 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:43:07 crc kubenswrapper[4766]: I1209 04:43:07.318244 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"caec6864870b555babc6bdf649e5fc84b874ca528991ac1003b50ad5ab1fca38"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:43:07 crc kubenswrapper[4766]: I1209 04:43:07.318308 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://caec6864870b555babc6bdf649e5fc84b874ca528991ac1003b50ad5ab1fca38" gracePeriod=600 Dec 09 04:43:08 crc kubenswrapper[4766]: I1209 04:43:08.157726 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="caec6864870b555babc6bdf649e5fc84b874ca528991ac1003b50ad5ab1fca38" exitCode=0 Dec 09 04:43:08 crc kubenswrapper[4766]: I1209 04:43:08.157874 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"caec6864870b555babc6bdf649e5fc84b874ca528991ac1003b50ad5ab1fca38"} Dec 09 04:43:08 crc 
kubenswrapper[4766]: I1209 04:43:08.158365 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d"} Dec 09 04:43:08 crc kubenswrapper[4766]: I1209 04:43:08.158416 4766 scope.go:117] "RemoveContainer" containerID="204c0f0abe36e7e3fb1d4036458a43f3dbe96d4aea5373973130af6602ea8cf9" Dec 09 04:43:14 crc kubenswrapper[4766]: I1209 04:43:14.030365 4766 scope.go:117] "RemoveContainer" containerID="5f3352328b8a9498de6d80ab3dd6bc6dae3b08ddc1cb9f4a336e62d703ed8337" Dec 09 04:43:17 crc kubenswrapper[4766]: I1209 04:43:17.987392 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8b8987d78-zlv6w" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.100357 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.105774 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.111516 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.111748 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.111963 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-45v4s" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.115558 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.125851 4766 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c488622f-1c8f-4545-bb92-63c7007713db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T04:43:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T04:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T04:43:21Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T04:43:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lhzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T04:43:21Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.129289 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.137527 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 09 04:43:21 crc kubenswrapper[4766]: E1209 04:43:21.146739 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7lhzn openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="c488622f-1c8f-4545-bb92-63c7007713db" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.169493 4766 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/openstackclient"] Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.170579 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.179808 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.196735 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c488622f-1c8f-4545-bb92-63c7007713db" podUID="1efe8f28-9359-4214-911b-594db568c9a5" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.207087 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhzn\" (UniqueName: \"kubernetes.io/projected/c488622f-1c8f-4545-bb92-63c7007713db-kube-api-access-7lhzn\") pod \"openstackclient\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.207140 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config\") pod \"openstackclient\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.207158 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config-secret\") pod \"openstackclient\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.296068 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.299473 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c488622f-1c8f-4545-bb92-63c7007713db" podUID="1efe8f28-9359-4214-911b-594db568c9a5" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.305803 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.310206 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config\") pod \"openstackclient\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.310270 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config-secret\") pod \"openstackclient\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.310311 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fqq\" (UniqueName: \"kubernetes.io/projected/1efe8f28-9359-4214-911b-594db568c9a5-kube-api-access-z5fqq\") pod \"openstackclient\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.310454 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.310498 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config\") pod \"openstackclient\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.310522 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhzn\" (UniqueName: \"kubernetes.io/projected/c488622f-1c8f-4545-bb92-63c7007713db-kube-api-access-7lhzn\") pod \"openstackclient\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.311117 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c488622f-1c8f-4545-bb92-63c7007713db" podUID="1efe8f28-9359-4214-911b-594db568c9a5" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.311232 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config\") pod \"openstackclient\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: E1209 04:43:21.312550 4766 projected.go:194] Error preparing data for projected volume kube-api-access-7lhzn for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c488622f-1c8f-4545-bb92-63c7007713db) does not match the UID in record. 
The object might have been deleted and then recreated Dec 09 04:43:21 crc kubenswrapper[4766]: E1209 04:43:21.312599 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c488622f-1c8f-4545-bb92-63c7007713db-kube-api-access-7lhzn podName:c488622f-1c8f-4545-bb92-63c7007713db nodeName:}" failed. No retries permitted until 2025-12-09 04:43:21.81258377 +0000 UTC m=+5483.521889196 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7lhzn" (UniqueName: "kubernetes.io/projected/c488622f-1c8f-4545-bb92-63c7007713db-kube-api-access-7lhzn") pod "openstackclient" (UID: "c488622f-1c8f-4545-bb92-63c7007713db") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c488622f-1c8f-4545-bb92-63c7007713db) does not match the UID in record. The object might have been deleted and then recreated Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.315709 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config-secret\") pod \"openstackclient\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.412013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config\") pod \"c488622f-1c8f-4545-bb92-63c7007713db\" (UID: \"c488622f-1c8f-4545-bb92-63c7007713db\") " Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.412436 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config-secret\") pod \"c488622f-1c8f-4545-bb92-63c7007713db\" (UID: 
\"c488622f-1c8f-4545-bb92-63c7007713db\") " Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.412530 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c488622f-1c8f-4545-bb92-63c7007713db" (UID: "c488622f-1c8f-4545-bb92-63c7007713db"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.412739 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config-secret\") pod \"openstackclient\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.412782 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config\") pod \"openstackclient\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.412839 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fqq\" (UniqueName: \"kubernetes.io/projected/1efe8f28-9359-4214-911b-594db568c9a5-kube-api-access-z5fqq\") pod \"openstackclient\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.412964 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lhzn\" (UniqueName: \"kubernetes.io/projected/c488622f-1c8f-4545-bb92-63c7007713db-kube-api-access-7lhzn\") on node \"crc\" DevicePath \"\"" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.412979 4766 reconciler_common.go:293] "Volume 
detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.413920 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config\") pod \"openstackclient\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.415689 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c488622f-1c8f-4545-bb92-63c7007713db" (UID: "c488622f-1c8f-4545-bb92-63c7007713db"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.416852 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config-secret\") pod \"openstackclient\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.433835 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5fqq\" (UniqueName: \"kubernetes.io/projected/1efe8f28-9359-4214-911b-594db568c9a5-kube-api-access-z5fqq\") pod \"openstackclient\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.501429 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 04:43:21 crc kubenswrapper[4766]: I1209 04:43:21.515203 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c488622f-1c8f-4545-bb92-63c7007713db-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 09 04:43:22 crc kubenswrapper[4766]: I1209 04:43:22.030903 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 04:43:22 crc kubenswrapper[4766]: I1209 04:43:22.310273 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1efe8f28-9359-4214-911b-594db568c9a5","Type":"ContainerStarted","Data":"eda57d876cfc768360846060dd21b3ebad99c69d1a82fda52cbc45b7acfaf5b1"} Dec 09 04:43:22 crc kubenswrapper[4766]: I1209 04:43:22.310332 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 04:43:22 crc kubenswrapper[4766]: I1209 04:43:22.310353 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1efe8f28-9359-4214-911b-594db568c9a5","Type":"ContainerStarted","Data":"4a8d7ee0aae8192c225834dbcb873d1dcd095cc876edf5fafd922a995b84c339"} Dec 09 04:43:22 crc kubenswrapper[4766]: I1209 04:43:22.342092 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c488622f-1c8f-4545-bb92-63c7007713db" podUID="1efe8f28-9359-4214-911b-594db568c9a5" Dec 09 04:43:22 crc kubenswrapper[4766]: I1209 04:43:22.342325 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.342306989 podStartE2EDuration="1.342306989s" podCreationTimestamp="2025-12-09 04:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:43:22.335126535 +0000 UTC 
m=+5484.044431971" watchObservedRunningTime="2025-12-09 04:43:22.342306989 +0000 UTC m=+5484.051612425" Dec 09 04:43:22 crc kubenswrapper[4766]: I1209 04:43:22.397700 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c488622f-1c8f-4545-bb92-63c7007713db" podUID="1efe8f28-9359-4214-911b-594db568c9a5" Dec 09 04:43:22 crc kubenswrapper[4766]: I1209 04:43:22.851105 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c488622f-1c8f-4545-bb92-63c7007713db" path="/var/lib/kubelet/pods/c488622f-1c8f-4545-bb92-63c7007713db/volumes" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.396783 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p2fh5"] Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.401440 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.419143 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p2fh5"] Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.516790 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-catalog-content\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.516874 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-utilities\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc 
kubenswrapper[4766]: I1209 04:43:43.516940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74cgr\" (UniqueName: \"kubernetes.io/projected/683287e5-c854-449e-bf57-1a59a15ebe68-kube-api-access-74cgr\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.618046 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-catalog-content\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.618130 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-utilities\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.618183 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74cgr\" (UniqueName: \"kubernetes.io/projected/683287e5-c854-449e-bf57-1a59a15ebe68-kube-api-access-74cgr\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.618965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-utilities\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 
04:43:43.619355 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-catalog-content\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.637393 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74cgr\" (UniqueName: \"kubernetes.io/projected/683287e5-c854-449e-bf57-1a59a15ebe68-kube-api-access-74cgr\") pod \"redhat-operators-p2fh5\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:43 crc kubenswrapper[4766]: I1209 04:43:43.723776 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:44 crc kubenswrapper[4766]: I1209 04:43:44.183044 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p2fh5"] Dec 09 04:43:44 crc kubenswrapper[4766]: I1209 04:43:44.529196 4766 generic.go:334] "Generic (PLEG): container finished" podID="683287e5-c854-449e-bf57-1a59a15ebe68" containerID="a03653dc86780941406a1b08f7b58f6ce807ccfd0ffbb29b0099e42dabd2a770" exitCode=0 Dec 09 04:43:44 crc kubenswrapper[4766]: I1209 04:43:44.529304 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2fh5" event={"ID":"683287e5-c854-449e-bf57-1a59a15ebe68","Type":"ContainerDied","Data":"a03653dc86780941406a1b08f7b58f6ce807ccfd0ffbb29b0099e42dabd2a770"} Dec 09 04:43:44 crc kubenswrapper[4766]: I1209 04:43:44.529520 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2fh5" event={"ID":"683287e5-c854-449e-bf57-1a59a15ebe68","Type":"ContainerStarted","Data":"834d30901b034ac622018ef10e75c5ece26a1d56a0556801391e75ab1a56f32f"} Dec 09 
04:43:44 crc kubenswrapper[4766]: I1209 04:43:44.531168 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 04:43:45 crc kubenswrapper[4766]: I1209 04:43:45.538602 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2fh5" event={"ID":"683287e5-c854-449e-bf57-1a59a15ebe68","Type":"ContainerStarted","Data":"116283d91fb644827998cb576c18b4e03a5927404cd765faa5f1b69a7f234924"} Dec 09 04:43:46 crc kubenswrapper[4766]: I1209 04:43:46.551658 4766 generic.go:334] "Generic (PLEG): container finished" podID="683287e5-c854-449e-bf57-1a59a15ebe68" containerID="116283d91fb644827998cb576c18b4e03a5927404cd765faa5f1b69a7f234924" exitCode=0 Dec 09 04:43:46 crc kubenswrapper[4766]: I1209 04:43:46.551702 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2fh5" event={"ID":"683287e5-c854-449e-bf57-1a59a15ebe68","Type":"ContainerDied","Data":"116283d91fb644827998cb576c18b4e03a5927404cd765faa5f1b69a7f234924"} Dec 09 04:43:47 crc kubenswrapper[4766]: I1209 04:43:47.560713 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2fh5" event={"ID":"683287e5-c854-449e-bf57-1a59a15ebe68","Type":"ContainerStarted","Data":"87157f1487a35ca5176a8b1b0cadb492f9e9c3fd8b5fa400cb07ae0629bf850a"} Dec 09 04:43:47 crc kubenswrapper[4766]: I1209 04:43:47.586562 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p2fh5" podStartSLOduration=2.150564798 podStartE2EDuration="4.586547116s" podCreationTimestamp="2025-12-09 04:43:43 +0000 UTC" firstStartedPulling="2025-12-09 04:43:44.530920526 +0000 UTC m=+5506.240225952" lastFinishedPulling="2025-12-09 04:43:46.966902844 +0000 UTC m=+5508.676208270" observedRunningTime="2025-12-09 04:43:47.581622582 +0000 UTC m=+5509.290928018" watchObservedRunningTime="2025-12-09 04:43:47.586547116 +0000 UTC 
m=+5509.295852542" Dec 09 04:43:53 crc kubenswrapper[4766]: I1209 04:43:53.724608 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:53 crc kubenswrapper[4766]: I1209 04:43:53.724971 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:53 crc kubenswrapper[4766]: I1209 04:43:53.782531 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:54 crc kubenswrapper[4766]: I1209 04:43:54.657639 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:54 crc kubenswrapper[4766]: I1209 04:43:54.721410 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p2fh5"] Dec 09 04:43:56 crc kubenswrapper[4766]: I1209 04:43:56.628149 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p2fh5" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" containerName="registry-server" containerID="cri-o://87157f1487a35ca5176a8b1b0cadb492f9e9c3fd8b5fa400cb07ae0629bf850a" gracePeriod=2 Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.658707 4766 generic.go:334] "Generic (PLEG): container finished" podID="683287e5-c854-449e-bf57-1a59a15ebe68" containerID="87157f1487a35ca5176a8b1b0cadb492f9e9c3fd8b5fa400cb07ae0629bf850a" exitCode=0 Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.659381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2fh5" event={"ID":"683287e5-c854-449e-bf57-1a59a15ebe68","Type":"ContainerDied","Data":"87157f1487a35ca5176a8b1b0cadb492f9e9c3fd8b5fa400cb07ae0629bf850a"} Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.896690 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.982057 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-utilities\") pod \"683287e5-c854-449e-bf57-1a59a15ebe68\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.982278 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74cgr\" (UniqueName: \"kubernetes.io/projected/683287e5-c854-449e-bf57-1a59a15ebe68-kube-api-access-74cgr\") pod \"683287e5-c854-449e-bf57-1a59a15ebe68\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.982396 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-catalog-content\") pod \"683287e5-c854-449e-bf57-1a59a15ebe68\" (UID: \"683287e5-c854-449e-bf57-1a59a15ebe68\") " Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.982688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-utilities" (OuterVolumeSpecName: "utilities") pod "683287e5-c854-449e-bf57-1a59a15ebe68" (UID: "683287e5-c854-449e-bf57-1a59a15ebe68"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.983297 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:43:58 crc kubenswrapper[4766]: I1209 04:43:58.990559 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/683287e5-c854-449e-bf57-1a59a15ebe68-kube-api-access-74cgr" (OuterVolumeSpecName: "kube-api-access-74cgr") pod "683287e5-c854-449e-bf57-1a59a15ebe68" (UID: "683287e5-c854-449e-bf57-1a59a15ebe68"). InnerVolumeSpecName "kube-api-access-74cgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.084487 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "683287e5-c854-449e-bf57-1a59a15ebe68" (UID: "683287e5-c854-449e-bf57-1a59a15ebe68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.085279 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/683287e5-c854-449e-bf57-1a59a15ebe68-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.085312 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74cgr\" (UniqueName: \"kubernetes.io/projected/683287e5-c854-449e-bf57-1a59a15ebe68-kube-api-access-74cgr\") on node \"crc\" DevicePath \"\"" Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.685418 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2fh5" event={"ID":"683287e5-c854-449e-bf57-1a59a15ebe68","Type":"ContainerDied","Data":"834d30901b034ac622018ef10e75c5ece26a1d56a0556801391e75ab1a56f32f"} Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.685784 4766 scope.go:117] "RemoveContainer" containerID="87157f1487a35ca5176a8b1b0cadb492f9e9c3fd8b5fa400cb07ae0629bf850a" Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.685955 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p2fh5" Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.744489 4766 scope.go:117] "RemoveContainer" containerID="116283d91fb644827998cb576c18b4e03a5927404cd765faa5f1b69a7f234924" Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.751240 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p2fh5"] Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.777403 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p2fh5"] Dec 09 04:43:59 crc kubenswrapper[4766]: I1209 04:43:59.798364 4766 scope.go:117] "RemoveContainer" containerID="a03653dc86780941406a1b08f7b58f6ce807ccfd0ffbb29b0099e42dabd2a770" Dec 09 04:44:00 crc kubenswrapper[4766]: I1209 04:44:00.849409 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" path="/var/lib/kubelet/pods/683287e5-c854-449e-bf57-1a59a15ebe68/volumes" Dec 09 04:44:44 crc kubenswrapper[4766]: E1209 04:44:44.874747 4766 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.196:40276->38.102.83.196:40815: read tcp 38.102.83.196:40276->38.102.83.196:40815: read: connection reset by peer Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.155049 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6"] Dec 09 04:45:00 crc kubenswrapper[4766]: E1209 04:45:00.155943 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" containerName="registry-server" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.155958 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" containerName="registry-server" Dec 09 04:45:00 crc kubenswrapper[4766]: E1209 04:45:00.155998 4766 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" containerName="extract-content" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.156012 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" containerName="extract-content" Dec 09 04:45:00 crc kubenswrapper[4766]: E1209 04:45:00.156055 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" containerName="extract-utilities" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.156064 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" containerName="extract-utilities" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.156332 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="683287e5-c854-449e-bf57-1a59a15ebe68" containerName="registry-server" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.157060 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.160380 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.160653 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.169959 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6"] Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.283535 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvlns\" (UniqueName: \"kubernetes.io/projected/c3861350-bc7d-4211-a4ba-fe76779c97f7-kube-api-access-mvlns\") pod 
\"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.283606 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3861350-bc7d-4211-a4ba-fe76779c97f7-config-volume\") pod \"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.283630 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3861350-bc7d-4211-a4ba-fe76779c97f7-secret-volume\") pod \"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.385460 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvlns\" (UniqueName: \"kubernetes.io/projected/c3861350-bc7d-4211-a4ba-fe76779c97f7-kube-api-access-mvlns\") pod \"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.385526 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3861350-bc7d-4211-a4ba-fe76779c97f7-config-volume\") pod \"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.385549 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3861350-bc7d-4211-a4ba-fe76779c97f7-secret-volume\") pod \"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.386825 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3861350-bc7d-4211-a4ba-fe76779c97f7-config-volume\") pod \"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.401249 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3861350-bc7d-4211-a4ba-fe76779c97f7-secret-volume\") pod \"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.405268 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvlns\" (UniqueName: \"kubernetes.io/projected/c3861350-bc7d-4211-a4ba-fe76779c97f7-kube-api-access-mvlns\") pod \"collect-profiles-29420925-npxr6\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.482090 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:00 crc kubenswrapper[4766]: I1209 04:45:00.963810 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6"] Dec 09 04:45:01 crc kubenswrapper[4766]: I1209 04:45:01.231407 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" event={"ID":"c3861350-bc7d-4211-a4ba-fe76779c97f7","Type":"ContainerStarted","Data":"a5921f833aef397b51a6560d82869249a07eff39811a083068a2f713dced4e64"} Dec 09 04:45:01 crc kubenswrapper[4766]: I1209 04:45:01.231826 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" event={"ID":"c3861350-bc7d-4211-a4ba-fe76779c97f7","Type":"ContainerStarted","Data":"5f9c5b5c2cebf1812e970d1102c0f8219c9fd8feaf9caa8546a84e7885c542e7"} Dec 09 04:45:01 crc kubenswrapper[4766]: I1209 04:45:01.252945 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" podStartSLOduration=1.252924004 podStartE2EDuration="1.252924004s" podCreationTimestamp="2025-12-09 04:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:01.247499637 +0000 UTC m=+5582.956805063" watchObservedRunningTime="2025-12-09 04:45:01.252924004 +0000 UTC m=+5582.962229430" Dec 09 04:45:02 crc kubenswrapper[4766]: I1209 04:45:02.243593 4766 generic.go:334] "Generic (PLEG): container finished" podID="c3861350-bc7d-4211-a4ba-fe76779c97f7" containerID="a5921f833aef397b51a6560d82869249a07eff39811a083068a2f713dced4e64" exitCode=0 Dec 09 04:45:02 crc kubenswrapper[4766]: I1209 04:45:02.243731 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" event={"ID":"c3861350-bc7d-4211-a4ba-fe76779c97f7","Type":"ContainerDied","Data":"a5921f833aef397b51a6560d82869249a07eff39811a083068a2f713dced4e64"} Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.602015 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.688372 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4tj69"] Dec 09 04:45:03 crc kubenswrapper[4766]: E1209 04:45:03.688706 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3861350-bc7d-4211-a4ba-fe76779c97f7" containerName="collect-profiles" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.688721 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3861350-bc7d-4211-a4ba-fe76779c97f7" containerName="collect-profiles" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.688872 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3861350-bc7d-4211-a4ba-fe76779c97f7" containerName="collect-profiles" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.689519 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.708012 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4tj69"] Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.746575 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3861350-bc7d-4211-a4ba-fe76779c97f7-secret-volume\") pod \"c3861350-bc7d-4211-a4ba-fe76779c97f7\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.746819 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvlns\" (UniqueName: \"kubernetes.io/projected/c3861350-bc7d-4211-a4ba-fe76779c97f7-kube-api-access-mvlns\") pod \"c3861350-bc7d-4211-a4ba-fe76779c97f7\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.747515 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3861350-bc7d-4211-a4ba-fe76779c97f7-config-volume\") pod \"c3861350-bc7d-4211-a4ba-fe76779c97f7\" (UID: \"c3861350-bc7d-4211-a4ba-fe76779c97f7\") " Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.747890 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3861350-bc7d-4211-a4ba-fe76779c97f7-config-volume" (OuterVolumeSpecName: "config-volume") pod "c3861350-bc7d-4211-a4ba-fe76779c97f7" (UID: "c3861350-bc7d-4211-a4ba-fe76779c97f7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.748172 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3861350-bc7d-4211-a4ba-fe76779c97f7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.751493 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3861350-bc7d-4211-a4ba-fe76779c97f7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c3861350-bc7d-4211-a4ba-fe76779c97f7" (UID: "c3861350-bc7d-4211-a4ba-fe76779c97f7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.752641 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3861350-bc7d-4211-a4ba-fe76779c97f7-kube-api-access-mvlns" (OuterVolumeSpecName: "kube-api-access-mvlns") pod "c3861350-bc7d-4211-a4ba-fe76779c97f7" (UID: "c3861350-bc7d-4211-a4ba-fe76779c97f7"). InnerVolumeSpecName "kube-api-access-mvlns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.799780 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6f65-account-create-update-5gg5r"] Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.803232 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.805590 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.814091 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6f65-account-create-update-5gg5r"] Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.849523 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsz9s\" (UniqueName: \"kubernetes.io/projected/8aea6daf-6c69-4bac-aea8-5d946702ceb3-kube-api-access-wsz9s\") pod \"barbican-db-create-4tj69\" (UID: \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\") " pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.849637 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea6daf-6c69-4bac-aea8-5d946702ceb3-operator-scripts\") pod \"barbican-db-create-4tj69\" (UID: \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\") " pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.849716 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3861350-bc7d-4211-a4ba-fe76779c97f7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.849738 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvlns\" (UniqueName: \"kubernetes.io/projected/c3861350-bc7d-4211-a4ba-fe76779c97f7-kube-api-access-mvlns\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.950996 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8aea6daf-6c69-4bac-aea8-5d946702ceb3-operator-scripts\") pod \"barbican-db-create-4tj69\" (UID: \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\") " pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.951055 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9b6c\" (UniqueName: \"kubernetes.io/projected/920faed1-036c-4f9e-9bd0-758f76a7e4d9-kube-api-access-m9b6c\") pod \"barbican-6f65-account-create-update-5gg5r\" (UID: \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\") " pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.951087 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920faed1-036c-4f9e-9bd0-758f76a7e4d9-operator-scripts\") pod \"barbican-6f65-account-create-update-5gg5r\" (UID: \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\") " pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.951248 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsz9s\" (UniqueName: \"kubernetes.io/projected/8aea6daf-6c69-4bac-aea8-5d946702ceb3-kube-api-access-wsz9s\") pod \"barbican-db-create-4tj69\" (UID: \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\") " pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.952029 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea6daf-6c69-4bac-aea8-5d946702ceb3-operator-scripts\") pod \"barbican-db-create-4tj69\" (UID: \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\") " pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:03 crc kubenswrapper[4766]: I1209 04:45:03.969047 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-wsz9s\" (UniqueName: \"kubernetes.io/projected/8aea6daf-6c69-4bac-aea8-5d946702ceb3-kube-api-access-wsz9s\") pod \"barbican-db-create-4tj69\" (UID: \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\") " pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.007492 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.052813 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9b6c\" (UniqueName: \"kubernetes.io/projected/920faed1-036c-4f9e-9bd0-758f76a7e4d9-kube-api-access-m9b6c\") pod \"barbican-6f65-account-create-update-5gg5r\" (UID: \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\") " pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.052877 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920faed1-036c-4f9e-9bd0-758f76a7e4d9-operator-scripts\") pod \"barbican-6f65-account-create-update-5gg5r\" (UID: \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\") " pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.053847 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920faed1-036c-4f9e-9bd0-758f76a7e4d9-operator-scripts\") pod \"barbican-6f65-account-create-update-5gg5r\" (UID: \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\") " pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.068361 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9b6c\" (UniqueName: \"kubernetes.io/projected/920faed1-036c-4f9e-9bd0-758f76a7e4d9-kube-api-access-m9b6c\") pod 
\"barbican-6f65-account-create-update-5gg5r\" (UID: \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\") " pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.143084 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.278281 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" event={"ID":"c3861350-bc7d-4211-a4ba-fe76779c97f7","Type":"ContainerDied","Data":"5f9c5b5c2cebf1812e970d1102c0f8219c9fd8feaf9caa8546a84e7885c542e7"} Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.278328 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.278340 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f9c5b5c2cebf1812e970d1102c0f8219c9fd8feaf9caa8546a84e7885c542e7" Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.350722 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw"] Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.360549 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420880-pk6nw"] Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.464138 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4tj69"] Dec 09 04:45:04 crc kubenswrapper[4766]: W1209 04:45:04.465414 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aea6daf_6c69_4bac_aea8_5d946702ceb3.slice/crio-a6facb0918172b5051e2b21e95f6fbdc5b205d932e0ede0be8e27efd276cbe1d 
WatchSource:0}: Error finding container a6facb0918172b5051e2b21e95f6fbdc5b205d932e0ede0be8e27efd276cbe1d: Status 404 returned error can't find the container with id a6facb0918172b5051e2b21e95f6fbdc5b205d932e0ede0be8e27efd276cbe1d Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.625120 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6f65-account-create-update-5gg5r"] Dec 09 04:45:04 crc kubenswrapper[4766]: I1209 04:45:04.851167 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9" path="/var/lib/kubelet/pods/e474f6e1-79e8-41e7-a86f-c3eaf99d0cf9/volumes" Dec 09 04:45:05 crc kubenswrapper[4766]: I1209 04:45:05.289065 4766 generic.go:334] "Generic (PLEG): container finished" podID="920faed1-036c-4f9e-9bd0-758f76a7e4d9" containerID="dcafac24941a5badba1868ee31abe3f0876f1677ee59545d93b1da4708b145ef" exitCode=0 Dec 09 04:45:05 crc kubenswrapper[4766]: I1209 04:45:05.289192 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f65-account-create-update-5gg5r" event={"ID":"920faed1-036c-4f9e-9bd0-758f76a7e4d9","Type":"ContainerDied","Data":"dcafac24941a5badba1868ee31abe3f0876f1677ee59545d93b1da4708b145ef"} Dec 09 04:45:05 crc kubenswrapper[4766]: I1209 04:45:05.289258 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f65-account-create-update-5gg5r" event={"ID":"920faed1-036c-4f9e-9bd0-758f76a7e4d9","Type":"ContainerStarted","Data":"402cf22c841b8d8d33bdaf1df092e329e444061131bc32cca53dcb113dca3ff7"} Dec 09 04:45:05 crc kubenswrapper[4766]: I1209 04:45:05.291385 4766 generic.go:334] "Generic (PLEG): container finished" podID="8aea6daf-6c69-4bac-aea8-5d946702ceb3" containerID="24c523662388fabd2a0f22feb8e8fae84c2e10dab3bc8b1591d566ba131e094e" exitCode=0 Dec 09 04:45:05 crc kubenswrapper[4766]: I1209 04:45:05.291718 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4tj69" 
event={"ID":"8aea6daf-6c69-4bac-aea8-5d946702ceb3","Type":"ContainerDied","Data":"24c523662388fabd2a0f22feb8e8fae84c2e10dab3bc8b1591d566ba131e094e"} Dec 09 04:45:05 crc kubenswrapper[4766]: I1209 04:45:05.292038 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4tj69" event={"ID":"8aea6daf-6c69-4bac-aea8-5d946702ceb3","Type":"ContainerStarted","Data":"a6facb0918172b5051e2b21e95f6fbdc5b205d932e0ede0be8e27efd276cbe1d"} Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.689681 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.694659 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.801682 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920faed1-036c-4f9e-9bd0-758f76a7e4d9-operator-scripts\") pod \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\" (UID: \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\") " Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.801739 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea6daf-6c69-4bac-aea8-5d946702ceb3-operator-scripts\") pod \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\" (UID: \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\") " Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.802039 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsz9s\" (UniqueName: \"kubernetes.io/projected/8aea6daf-6c69-4bac-aea8-5d946702ceb3-kube-api-access-wsz9s\") pod \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\" (UID: \"8aea6daf-6c69-4bac-aea8-5d946702ceb3\") " Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.802089 
4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9b6c\" (UniqueName: \"kubernetes.io/projected/920faed1-036c-4f9e-9bd0-758f76a7e4d9-kube-api-access-m9b6c\") pod \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\" (UID: \"920faed1-036c-4f9e-9bd0-758f76a7e4d9\") " Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.802471 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920faed1-036c-4f9e-9bd0-758f76a7e4d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "920faed1-036c-4f9e-9bd0-758f76a7e4d9" (UID: "920faed1-036c-4f9e-9bd0-758f76a7e4d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.802577 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aea6daf-6c69-4bac-aea8-5d946702ceb3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8aea6daf-6c69-4bac-aea8-5d946702ceb3" (UID: "8aea6daf-6c69-4bac-aea8-5d946702ceb3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.802798 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/920faed1-036c-4f9e-9bd0-758f76a7e4d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.802818 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea6daf-6c69-4bac-aea8-5d946702ceb3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.806839 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920faed1-036c-4f9e-9bd0-758f76a7e4d9-kube-api-access-m9b6c" (OuterVolumeSpecName: "kube-api-access-m9b6c") pod "920faed1-036c-4f9e-9bd0-758f76a7e4d9" (UID: "920faed1-036c-4f9e-9bd0-758f76a7e4d9"). InnerVolumeSpecName "kube-api-access-m9b6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.806953 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aea6daf-6c69-4bac-aea8-5d946702ceb3-kube-api-access-wsz9s" (OuterVolumeSpecName: "kube-api-access-wsz9s") pod "8aea6daf-6c69-4bac-aea8-5d946702ceb3" (UID: "8aea6daf-6c69-4bac-aea8-5d946702ceb3"). InnerVolumeSpecName "kube-api-access-wsz9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.904829 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsz9s\" (UniqueName: \"kubernetes.io/projected/8aea6daf-6c69-4bac-aea8-5d946702ceb3-kube-api-access-wsz9s\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:06 crc kubenswrapper[4766]: I1209 04:45:06.904867 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9b6c\" (UniqueName: \"kubernetes.io/projected/920faed1-036c-4f9e-9bd0-758f76a7e4d9-kube-api-access-m9b6c\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:07 crc kubenswrapper[4766]: I1209 04:45:07.317083 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f65-account-create-update-5gg5r" event={"ID":"920faed1-036c-4f9e-9bd0-758f76a7e4d9","Type":"ContainerDied","Data":"402cf22c841b8d8d33bdaf1df092e329e444061131bc32cca53dcb113dca3ff7"} Dec 09 04:45:07 crc kubenswrapper[4766]: I1209 04:45:07.317119 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6f65-account-create-update-5gg5r" Dec 09 04:45:07 crc kubenswrapper[4766]: I1209 04:45:07.317152 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402cf22c841b8d8d33bdaf1df092e329e444061131bc32cca53dcb113dca3ff7" Dec 09 04:45:07 crc kubenswrapper[4766]: I1209 04:45:07.317338 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:45:07 crc kubenswrapper[4766]: I1209 04:45:07.317413 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:45:07 crc kubenswrapper[4766]: I1209 04:45:07.320310 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4tj69" event={"ID":"8aea6daf-6c69-4bac-aea8-5d946702ceb3","Type":"ContainerDied","Data":"a6facb0918172b5051e2b21e95f6fbdc5b205d932e0ede0be8e27efd276cbe1d"} Dec 09 04:45:07 crc kubenswrapper[4766]: I1209 04:45:07.320352 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6facb0918172b5051e2b21e95f6fbdc5b205d932e0ede0be8e27efd276cbe1d" Dec 09 04:45:07 crc kubenswrapper[4766]: I1209 04:45:07.320403 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4tj69" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.087334 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6pxf5"] Dec 09 04:45:09 crc kubenswrapper[4766]: E1209 04:45:09.087738 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aea6daf-6c69-4bac-aea8-5d946702ceb3" containerName="mariadb-database-create" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.087752 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aea6daf-6c69-4bac-aea8-5d946702ceb3" containerName="mariadb-database-create" Dec 09 04:45:09 crc kubenswrapper[4766]: E1209 04:45:09.087773 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920faed1-036c-4f9e-9bd0-758f76a7e4d9" containerName="mariadb-account-create-update" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.087781 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="920faed1-036c-4f9e-9bd0-758f76a7e4d9" containerName="mariadb-account-create-update" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.087988 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="920faed1-036c-4f9e-9bd0-758f76a7e4d9" containerName="mariadb-account-create-update" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.088010 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aea6daf-6c69-4bac-aea8-5d946702ceb3" containerName="mariadb-database-create" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.088646 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.092924 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.093089 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-td427" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.106426 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6pxf5"] Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.246653 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-db-sync-config-data\") pod \"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.246978 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n569v\" (UniqueName: \"kubernetes.io/projected/a388e942-d085-4f67-8f6c-c04eedc01a84-kube-api-access-n569v\") pod \"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.247023 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-combined-ca-bundle\") pod \"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.348678 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-db-sync-config-data\") pod \"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.348759 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n569v\" (UniqueName: \"kubernetes.io/projected/a388e942-d085-4f67-8f6c-c04eedc01a84-kube-api-access-n569v\") pod \"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.348799 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-combined-ca-bundle\") pod \"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.355227 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-combined-ca-bundle\") pod \"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.366566 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-db-sync-config-data\") pod \"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.372199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n569v\" (UniqueName: \"kubernetes.io/projected/a388e942-d085-4f67-8f6c-c04eedc01a84-kube-api-access-n569v\") pod 
\"barbican-db-sync-6pxf5\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.414093 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:09 crc kubenswrapper[4766]: I1209 04:45:09.853348 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6pxf5"] Dec 09 04:45:10 crc kubenswrapper[4766]: I1209 04:45:10.346873 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6pxf5" event={"ID":"a388e942-d085-4f67-8f6c-c04eedc01a84","Type":"ContainerStarted","Data":"85a1468ad69504b403522a61ef2fa122d3a4e2f277511a377d0384e6df158599"} Dec 09 04:45:10 crc kubenswrapper[4766]: I1209 04:45:10.346943 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6pxf5" event={"ID":"a388e942-d085-4f67-8f6c-c04eedc01a84","Type":"ContainerStarted","Data":"e1ce61775c971fa3a3534b3f5caf28c2cfc9b5eaff8ab577bbd68d90bfc49e19"} Dec 09 04:45:10 crc kubenswrapper[4766]: I1209 04:45:10.372867 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6pxf5" podStartSLOduration=1.372834919 podStartE2EDuration="1.372834919s" podCreationTimestamp="2025-12-09 04:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:10.367478114 +0000 UTC m=+5592.076783580" watchObservedRunningTime="2025-12-09 04:45:10.372834919 +0000 UTC m=+5592.082140385" Dec 09 04:45:11 crc kubenswrapper[4766]: I1209 04:45:11.360703 4766 generic.go:334] "Generic (PLEG): container finished" podID="a388e942-d085-4f67-8f6c-c04eedc01a84" containerID="85a1468ad69504b403522a61ef2fa122d3a4e2f277511a377d0384e6df158599" exitCode=0 Dec 09 04:45:11 crc kubenswrapper[4766]: I1209 04:45:11.360762 4766 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-db-sync-6pxf5" event={"ID":"a388e942-d085-4f67-8f6c-c04eedc01a84","Type":"ContainerDied","Data":"85a1468ad69504b403522a61ef2fa122d3a4e2f277511a377d0384e6df158599"} Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.702003 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.804741 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-db-sync-config-data\") pod \"a388e942-d085-4f67-8f6c-c04eedc01a84\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.805534 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n569v\" (UniqueName: \"kubernetes.io/projected/a388e942-d085-4f67-8f6c-c04eedc01a84-kube-api-access-n569v\") pod \"a388e942-d085-4f67-8f6c-c04eedc01a84\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.805619 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-combined-ca-bundle\") pod \"a388e942-d085-4f67-8f6c-c04eedc01a84\" (UID: \"a388e942-d085-4f67-8f6c-c04eedc01a84\") " Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.811146 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a388e942-d085-4f67-8f6c-c04eedc01a84-kube-api-access-n569v" (OuterVolumeSpecName: "kube-api-access-n569v") pod "a388e942-d085-4f67-8f6c-c04eedc01a84" (UID: "a388e942-d085-4f67-8f6c-c04eedc01a84"). InnerVolumeSpecName "kube-api-access-n569v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.819559 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a388e942-d085-4f67-8f6c-c04eedc01a84" (UID: "a388e942-d085-4f67-8f6c-c04eedc01a84"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.842149 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a388e942-d085-4f67-8f6c-c04eedc01a84" (UID: "a388e942-d085-4f67-8f6c-c04eedc01a84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.908673 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n569v\" (UniqueName: \"kubernetes.io/projected/a388e942-d085-4f67-8f6c-c04eedc01a84-kube-api-access-n569v\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.908718 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:12 crc kubenswrapper[4766]: I1209 04:45:12.908738 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a388e942-d085-4f67-8f6c-c04eedc01a84-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.380528 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6pxf5" 
event={"ID":"a388e942-d085-4f67-8f6c-c04eedc01a84","Type":"ContainerDied","Data":"e1ce61775c971fa3a3534b3f5caf28c2cfc9b5eaff8ab577bbd68d90bfc49e19"} Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.380574 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ce61775c971fa3a3534b3f5caf28c2cfc9b5eaff8ab577bbd68d90bfc49e19" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.380592 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6pxf5" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.667703 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-54f9c8697f-tdwlg"] Dec 09 04:45:13 crc kubenswrapper[4766]: E1209 04:45:13.668083 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a388e942-d085-4f67-8f6c-c04eedc01a84" containerName="barbican-db-sync" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.668100 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a388e942-d085-4f67-8f6c-c04eedc01a84" containerName="barbican-db-sync" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.668270 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a388e942-d085-4f67-8f6c-c04eedc01a84" containerName="barbican-db-sync" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.669168 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.673986 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.675471 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-td427" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.675628 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.694360 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54f9c8697f-tdwlg"] Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.812191 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-85b957594-kjrbm"] Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.814275 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.823140 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.825290 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cece7620-62bd-44f9-8fad-c527e39ab2ee-logs\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.825397 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6smh\" (UniqueName: \"kubernetes.io/projected/cece7620-62bd-44f9-8fad-c527e39ab2ee-kube-api-access-g6smh\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.825444 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-config-data-custom\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.825474 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-config-data\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.825638 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-combined-ca-bundle\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.841069 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-85b957594-kjrbm"] Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.891290 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-557b9c7599-xg6p4"] Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.892652 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.922647 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557b9c7599-xg6p4"] Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927285 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cece7620-62bd-44f9-8fad-c527e39ab2ee-logs\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927355 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkqb\" (UniqueName: \"kubernetes.io/projected/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-kube-api-access-vvkqb\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927405 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-combined-ca-bundle\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927433 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-config-data-custom\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-config-data\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927511 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6smh\" (UniqueName: \"kubernetes.io/projected/cece7620-62bd-44f9-8fad-c527e39ab2ee-kube-api-access-g6smh\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927570 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-config-data-custom\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " 
pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927597 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-logs\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927621 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-config-data\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.927698 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-combined-ca-bundle\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.929623 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cece7620-62bd-44f9-8fad-c527e39ab2ee-logs\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.932956 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-config-data\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " 
pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.939012 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-combined-ca-bundle\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.948084 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cece7620-62bd-44f9-8fad-c527e39ab2ee-config-data-custom\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.979208 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6smh\" (UniqueName: \"kubernetes.io/projected/cece7620-62bd-44f9-8fad-c527e39ab2ee-kube-api-access-g6smh\") pod \"barbican-worker-54f9c8697f-tdwlg\" (UID: \"cece7620-62bd-44f9-8fad-c527e39ab2ee\") " pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:13 crc kubenswrapper[4766]: I1209 04:45:13.987607 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-54f9c8697f-tdwlg" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028720 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkqb\" (UniqueName: \"kubernetes.io/projected/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-kube-api-access-vvkqb\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028773 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-nb\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-combined-ca-bundle\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028823 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-config-data-custom\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028846 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-config-data\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028890 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-dns-svc\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028912 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-logs\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028954 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqslz\" (UniqueName: \"kubernetes.io/projected/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-kube-api-access-zqslz\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.028982 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-config\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.029011 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-sb\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.041420 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-logs\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.041543 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-config-data-custom\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.048332 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-config-data\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.052148 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-combined-ca-bundle\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.055969 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-vvkqb\" (UniqueName: \"kubernetes.io/projected/f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe-kube-api-access-vvkqb\") pod \"barbican-keystone-listener-85b957594-kjrbm\" (UID: \"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe\") " pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.066468 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7457696574-f27s8"] Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.068032 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.069841 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.082278 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7457696574-f27s8"] Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.130165 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-config\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.134458 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-sb\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.134716 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-nb\") pod 
\"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.132279 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-config\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.134929 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-dns-svc\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.135108 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqslz\" (UniqueName: \"kubernetes.io/projected/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-kube-api-access-zqslz\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.135675 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-sb\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.136394 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-dns-svc\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " 
pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.136774 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-nb\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.144463 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-85b957594-kjrbm" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.151192 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqslz\" (UniqueName: \"kubernetes.io/projected/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-kube-api-access-zqslz\") pod \"dnsmasq-dns-557b9c7599-xg6p4\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.162351 4766 scope.go:117] "RemoveContainer" containerID="8c006bbf0c9b96874dda4aed6fae029bc9865cd0d6ac71130032fa7525335a3b" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.227581 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.236924 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-config-data-custom\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.237060 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-config-data\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.237124 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2538afb4-ca93-4611-aa92-034f134c476d-logs\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.237173 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq6vt\" (UniqueName: \"kubernetes.io/projected/2538afb4-ca93-4611-aa92-034f134c476d-kube-api-access-wq6vt\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.237201 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-combined-ca-bundle\") pod 
\"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.339453 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-combined-ca-bundle\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.339539 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-config-data-custom\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.339606 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-config-data\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.339638 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2538afb4-ca93-4611-aa92-034f134c476d-logs\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.339667 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq6vt\" (UniqueName: \"kubernetes.io/projected/2538afb4-ca93-4611-aa92-034f134c476d-kube-api-access-wq6vt\") pod \"barbican-api-7457696574-f27s8\" (UID: 
\"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.340260 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2538afb4-ca93-4611-aa92-034f134c476d-logs\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.346800 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-config-data-custom\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.346949 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-combined-ca-bundle\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.347140 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538afb4-ca93-4611-aa92-034f134c476d-config-data\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.365596 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq6vt\" (UniqueName: \"kubernetes.io/projected/2538afb4-ca93-4611-aa92-034f134c476d-kube-api-access-wq6vt\") pod \"barbican-api-7457696574-f27s8\" (UID: \"2538afb4-ca93-4611-aa92-034f134c476d\") " pod="openstack/barbican-api-7457696574-f27s8" 
Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.446615 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.529939 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54f9c8697f-tdwlg"] Dec 09 04:45:14 crc kubenswrapper[4766]: W1209 04:45:14.633647 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6a8b1e1_4fe0_426a_acf7_9d74a1271dbe.slice/crio-b79c137aa6d304c47ccd93bba71de741a7c28e3a4f7de906000dcbaa90d6ccfb WatchSource:0}: Error finding container b79c137aa6d304c47ccd93bba71de741a7c28e3a4f7de906000dcbaa90d6ccfb: Status 404 returned error can't find the container with id b79c137aa6d304c47ccd93bba71de741a7c28e3a4f7de906000dcbaa90d6ccfb Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.636771 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-85b957594-kjrbm"] Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.717683 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557b9c7599-xg6p4"] Dec 09 04:45:14 crc kubenswrapper[4766]: I1209 04:45:14.896955 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7457696574-f27s8"] Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.400206 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85b957594-kjrbm" event={"ID":"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe","Type":"ContainerStarted","Data":"ddd61529fa278a89a5a46745f77cfb54b795c2f7e8c9a09a5ace1e5c7fbaa61f"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.400484 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85b957594-kjrbm" 
event={"ID":"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe","Type":"ContainerStarted","Data":"0d14ec28914bb3614b5241c8891d6181b72787021fefe4a4fcff62fbd05e8fca"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.400493 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-85b957594-kjrbm" event={"ID":"f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe","Type":"ContainerStarted","Data":"b79c137aa6d304c47ccd93bba71de741a7c28e3a4f7de906000dcbaa90d6ccfb"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.401871 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54f9c8697f-tdwlg" event={"ID":"cece7620-62bd-44f9-8fad-c527e39ab2ee","Type":"ContainerStarted","Data":"e39fdff8b7c61bf2c88feac0db2c8e701e99078328a7e25bcd39f3642f94eec1"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.401898 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54f9c8697f-tdwlg" event={"ID":"cece7620-62bd-44f9-8fad-c527e39ab2ee","Type":"ContainerStarted","Data":"91753126b443c227d3363b071b714b55fdceed451c361176d30d1cd47a6e377b"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.401910 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54f9c8697f-tdwlg" event={"ID":"cece7620-62bd-44f9-8fad-c527e39ab2ee","Type":"ContainerStarted","Data":"44043b36b849b9c291e7335d612d960f76d63bd14d8a4c0d49f007ef1831d1bb"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.408143 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7457696574-f27s8" event={"ID":"2538afb4-ca93-4611-aa92-034f134c476d","Type":"ContainerStarted","Data":"7c39be6f3767516e30d9db612dfcd30f75173549d48ad5d3a85d2b61f75e0c22"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.408201 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7457696574-f27s8" 
event={"ID":"2538afb4-ca93-4611-aa92-034f134c476d","Type":"ContainerStarted","Data":"c8c5cf2402ed8fd018cc1b3b6d87f1a79d52ffd63d01810bd99bf5432f24a806"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.415627 4766 generic.go:334] "Generic (PLEG): container finished" podID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" containerID="5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee" exitCode=0 Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.415905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" event={"ID":"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99","Type":"ContainerDied","Data":"5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.415939 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" event={"ID":"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99","Type":"ContainerStarted","Data":"bbc47e436e015b6d9ff9dd451346e3acc703f9ee1b3ba921c22b2e045a1ff1b7"} Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.418812 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-85b957594-kjrbm" podStartSLOduration=2.418791716 podStartE2EDuration="2.418791716s" podCreationTimestamp="2025-12-09 04:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:15.416898925 +0000 UTC m=+5597.126204361" watchObservedRunningTime="2025-12-09 04:45:15.418791716 +0000 UTC m=+5597.128097182" Dec 09 04:45:15 crc kubenswrapper[4766]: I1209 04:45:15.453564 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-54f9c8697f-tdwlg" podStartSLOduration=2.453537566 podStartE2EDuration="2.453537566s" podCreationTimestamp="2025-12-09 04:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:15.445503599 +0000 UTC m=+5597.154809035" watchObservedRunningTime="2025-12-09 04:45:15.453537566 +0000 UTC m=+5597.162843022" Dec 09 04:45:16 crc kubenswrapper[4766]: I1209 04:45:16.424484 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7457696574-f27s8" event={"ID":"2538afb4-ca93-4611-aa92-034f134c476d","Type":"ContainerStarted","Data":"bcf4ca4d1226f7f17dc97e601bc519aeb9f54cc2acca53b729858514d548a5e6"} Dec 09 04:45:16 crc kubenswrapper[4766]: I1209 04:45:16.424956 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:16 crc kubenswrapper[4766]: I1209 04:45:16.425110 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:16 crc kubenswrapper[4766]: I1209 04:45:16.426596 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" event={"ID":"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99","Type":"ContainerStarted","Data":"55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263"} Dec 09 04:45:16 crc kubenswrapper[4766]: I1209 04:45:16.441327 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7457696574-f27s8" podStartSLOduration=2.44131092 podStartE2EDuration="2.44131092s" podCreationTimestamp="2025-12-09 04:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:16.440380455 +0000 UTC m=+5598.149685891" watchObservedRunningTime="2025-12-09 04:45:16.44131092 +0000 UTC m=+5598.150616346" Dec 09 04:45:16 crc kubenswrapper[4766]: I1209 04:45:16.466691 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" podStartSLOduration=3.466675275 
podStartE2EDuration="3.466675275s" podCreationTimestamp="2025-12-09 04:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:16.462702118 +0000 UTC m=+5598.172007544" watchObservedRunningTime="2025-12-09 04:45:16.466675275 +0000 UTC m=+5598.175980701" Dec 09 04:45:17 crc kubenswrapper[4766]: I1209 04:45:17.446950 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:21 crc kubenswrapper[4766]: I1209 04:45:21.994125 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2qfbz"] Dec 09 04:45:21 crc kubenswrapper[4766]: I1209 04:45:21.998900 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.007270 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qfbz"] Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.083963 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-utilities\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.084036 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttmn7\" (UniqueName: \"kubernetes.io/projected/200ba46b-02de-411d-96dc-4dec32c5330a-kube-api-access-ttmn7\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.084107 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-catalog-content\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.186511 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-utilities\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.186616 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttmn7\" (UniqueName: \"kubernetes.io/projected/200ba46b-02de-411d-96dc-4dec32c5330a-kube-api-access-ttmn7\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.186657 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-catalog-content\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.187461 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-utilities\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.187740 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-catalog-content\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.224959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttmn7\" (UniqueName: \"kubernetes.io/projected/200ba46b-02de-411d-96dc-4dec32c5330a-kube-api-access-ttmn7\") pod \"certified-operators-2qfbz\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.359931 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:22 crc kubenswrapper[4766]: I1209 04:45:22.807841 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qfbz"] Dec 09 04:45:23 crc kubenswrapper[4766]: I1209 04:45:23.510734 4766 generic.go:334] "Generic (PLEG): container finished" podID="200ba46b-02de-411d-96dc-4dec32c5330a" containerID="f577513496fc8292f579964a07b701d2fc0eff18885297bcfe522d0f2edbfddb" exitCode=0 Dec 09 04:45:23 crc kubenswrapper[4766]: I1209 04:45:23.510845 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qfbz" event={"ID":"200ba46b-02de-411d-96dc-4dec32c5330a","Type":"ContainerDied","Data":"f577513496fc8292f579964a07b701d2fc0eff18885297bcfe522d0f2edbfddb"} Dec 09 04:45:23 crc kubenswrapper[4766]: I1209 04:45:23.511286 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qfbz" event={"ID":"200ba46b-02de-411d-96dc-4dec32c5330a","Type":"ContainerStarted","Data":"aee5fa396cdf0bc2afa8a28f053069bf4735083459a6ded852d9ce655e519dc4"} Dec 09 04:45:24 crc 
kubenswrapper[4766]: I1209 04:45:24.230450 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.316979 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-575fbf448f-f4bvn"] Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.317381 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" podUID="f63e3801-5ca0-48dd-9125-dded30ae25d9" containerName="dnsmasq-dns" containerID="cri-o://93c07e6ecea6847f1bd0d9e1783b135bb19ca1a993f27aa8a91ba0121e4714cf" gracePeriod=10 Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.553896 4766 generic.go:334] "Generic (PLEG): container finished" podID="f63e3801-5ca0-48dd-9125-dded30ae25d9" containerID="93c07e6ecea6847f1bd0d9e1783b135bb19ca1a993f27aa8a91ba0121e4714cf" exitCode=0 Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.553947 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" event={"ID":"f63e3801-5ca0-48dd-9125-dded30ae25d9","Type":"ContainerDied","Data":"93c07e6ecea6847f1bd0d9e1783b135bb19ca1a993f27aa8a91ba0121e4714cf"} Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.809710 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.944625 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwq8t\" (UniqueName: \"kubernetes.io/projected/f63e3801-5ca0-48dd-9125-dded30ae25d9-kube-api-access-hwq8t\") pod \"f63e3801-5ca0-48dd-9125-dded30ae25d9\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.944749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-dns-svc\") pod \"f63e3801-5ca0-48dd-9125-dded30ae25d9\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.944807 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-config\") pod \"f63e3801-5ca0-48dd-9125-dded30ae25d9\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.944833 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-sb\") pod \"f63e3801-5ca0-48dd-9125-dded30ae25d9\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.944910 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-nb\") pod \"f63e3801-5ca0-48dd-9125-dded30ae25d9\" (UID: \"f63e3801-5ca0-48dd-9125-dded30ae25d9\") " Dec 09 04:45:24 crc kubenswrapper[4766]: I1209 04:45:24.954014 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f63e3801-5ca0-48dd-9125-dded30ae25d9-kube-api-access-hwq8t" (OuterVolumeSpecName: "kube-api-access-hwq8t") pod "f63e3801-5ca0-48dd-9125-dded30ae25d9" (UID: "f63e3801-5ca0-48dd-9125-dded30ae25d9"). InnerVolumeSpecName "kube-api-access-hwq8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.000649 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f63e3801-5ca0-48dd-9125-dded30ae25d9" (UID: "f63e3801-5ca0-48dd-9125-dded30ae25d9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.013508 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f63e3801-5ca0-48dd-9125-dded30ae25d9" (UID: "f63e3801-5ca0-48dd-9125-dded30ae25d9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.025881 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-config" (OuterVolumeSpecName: "config") pod "f63e3801-5ca0-48dd-9125-dded30ae25d9" (UID: "f63e3801-5ca0-48dd-9125-dded30ae25d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.027391 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f63e3801-5ca0-48dd-9125-dded30ae25d9" (UID: "f63e3801-5ca0-48dd-9125-dded30ae25d9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.047374 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.047420 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwq8t\" (UniqueName: \"kubernetes.io/projected/f63e3801-5ca0-48dd-9125-dded30ae25d9-kube-api-access-hwq8t\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.047435 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.047448 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.047460 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f63e3801-5ca0-48dd-9125-dded30ae25d9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.570753 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.570763 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-575fbf448f-f4bvn" event={"ID":"f63e3801-5ca0-48dd-9125-dded30ae25d9","Type":"ContainerDied","Data":"19497619c6761f0e69093be05d217cef62f08d4d529aa04197b0439cc905f819"} Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.570821 4766 scope.go:117] "RemoveContainer" containerID="93c07e6ecea6847f1bd0d9e1783b135bb19ca1a993f27aa8a91ba0121e4714cf" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.576630 4766 generic.go:334] "Generic (PLEG): container finished" podID="200ba46b-02de-411d-96dc-4dec32c5330a" containerID="09134fe6ebc545ebe7f58915dae51d4711f8f66cfd567f96bfb070631a44293c" exitCode=0 Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.576711 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qfbz" event={"ID":"200ba46b-02de-411d-96dc-4dec32c5330a","Type":"ContainerDied","Data":"09134fe6ebc545ebe7f58915dae51d4711f8f66cfd567f96bfb070631a44293c"} Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.590448 4766 scope.go:117] "RemoveContainer" containerID="ea22423636339ab98b985864e440da4ad1c97150b5be5d864a72bd3360533a62" Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.622455 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-575fbf448f-f4bvn"] Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.631561 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-575fbf448f-f4bvn"] Dec 09 04:45:25 crc kubenswrapper[4766]: I1209 04:45:25.928056 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:26 crc kubenswrapper[4766]: I1209 04:45:26.020886 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-7457696574-f27s8" Dec 09 04:45:26 crc kubenswrapper[4766]: I1209 04:45:26.592571 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qfbz" event={"ID":"200ba46b-02de-411d-96dc-4dec32c5330a","Type":"ContainerStarted","Data":"471a4906e9207607dfb3dbbd1e235a3ab91b87c2c6093aa97d8642f3dc8a6631"} Dec 09 04:45:26 crc kubenswrapper[4766]: I1209 04:45:26.619397 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2qfbz" podStartSLOduration=3.153344664 podStartE2EDuration="5.619373633s" podCreationTimestamp="2025-12-09 04:45:21 +0000 UTC" firstStartedPulling="2025-12-09 04:45:23.512969162 +0000 UTC m=+5605.222274628" lastFinishedPulling="2025-12-09 04:45:25.978998131 +0000 UTC m=+5607.688303597" observedRunningTime="2025-12-09 04:45:26.616536067 +0000 UTC m=+5608.325841493" watchObservedRunningTime="2025-12-09 04:45:26.619373633 +0000 UTC m=+5608.328679059" Dec 09 04:45:26 crc kubenswrapper[4766]: I1209 04:45:26.877299 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63e3801-5ca0-48dd-9125-dded30ae25d9" path="/var/lib/kubelet/pods/f63e3801-5ca0-48dd-9125-dded30ae25d9/volumes" Dec 09 04:45:32 crc kubenswrapper[4766]: I1209 04:45:32.360030 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:32 crc kubenswrapper[4766]: I1209 04:45:32.360671 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:32 crc kubenswrapper[4766]: I1209 04:45:32.405051 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:32 crc kubenswrapper[4766]: I1209 04:45:32.695401 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:32 crc kubenswrapper[4766]: I1209 04:45:32.757673 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2qfbz"] Dec 09 04:45:34 crc kubenswrapper[4766]: I1209 04:45:34.663501 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2qfbz" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" containerName="registry-server" containerID="cri-o://471a4906e9207607dfb3dbbd1e235a3ab91b87c2c6093aa97d8642f3dc8a6631" gracePeriod=2 Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.673458 4766 generic.go:334] "Generic (PLEG): container finished" podID="200ba46b-02de-411d-96dc-4dec32c5330a" containerID="471a4906e9207607dfb3dbbd1e235a3ab91b87c2c6093aa97d8642f3dc8a6631" exitCode=0 Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.673554 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qfbz" event={"ID":"200ba46b-02de-411d-96dc-4dec32c5330a","Type":"ContainerDied","Data":"471a4906e9207607dfb3dbbd1e235a3ab91b87c2c6093aa97d8642f3dc8a6631"} Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.673898 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qfbz" event={"ID":"200ba46b-02de-411d-96dc-4dec32c5330a","Type":"ContainerDied","Data":"aee5fa396cdf0bc2afa8a28f053069bf4735083459a6ded852d9ce655e519dc4"} Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.673914 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aee5fa396cdf0bc2afa8a28f053069bf4735083459a6ded852d9ce655e519dc4" Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.679452 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.750399 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-utilities\") pod \"200ba46b-02de-411d-96dc-4dec32c5330a\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.750454 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttmn7\" (UniqueName: \"kubernetes.io/projected/200ba46b-02de-411d-96dc-4dec32c5330a-kube-api-access-ttmn7\") pod \"200ba46b-02de-411d-96dc-4dec32c5330a\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.750501 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-catalog-content\") pod \"200ba46b-02de-411d-96dc-4dec32c5330a\" (UID: \"200ba46b-02de-411d-96dc-4dec32c5330a\") " Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.751476 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-utilities" (OuterVolumeSpecName: "utilities") pod "200ba46b-02de-411d-96dc-4dec32c5330a" (UID: "200ba46b-02de-411d-96dc-4dec32c5330a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.758095 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200ba46b-02de-411d-96dc-4dec32c5330a-kube-api-access-ttmn7" (OuterVolumeSpecName: "kube-api-access-ttmn7") pod "200ba46b-02de-411d-96dc-4dec32c5330a" (UID: "200ba46b-02de-411d-96dc-4dec32c5330a"). InnerVolumeSpecName "kube-api-access-ttmn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.792857 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "200ba46b-02de-411d-96dc-4dec32c5330a" (UID: "200ba46b-02de-411d-96dc-4dec32c5330a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.852738 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.852782 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttmn7\" (UniqueName: \"kubernetes.io/projected/200ba46b-02de-411d-96dc-4dec32c5330a-kube-api-access-ttmn7\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:35 crc kubenswrapper[4766]: I1209 04:45:35.852799 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/200ba46b-02de-411d-96dc-4dec32c5330a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:36 crc kubenswrapper[4766]: I1209 04:45:36.680657 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qfbz" Dec 09 04:45:36 crc kubenswrapper[4766]: I1209 04:45:36.719706 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2qfbz"] Dec 09 04:45:36 crc kubenswrapper[4766]: I1209 04:45:36.730726 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2qfbz"] Dec 09 04:45:36 crc kubenswrapper[4766]: I1209 04:45:36.860763 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" path="/var/lib/kubelet/pods/200ba46b-02de-411d-96dc-4dec32c5330a/volumes" Dec 09 04:45:37 crc kubenswrapper[4766]: I1209 04:45:37.316086 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:45:37 crc kubenswrapper[4766]: I1209 04:45:37.316160 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.619659 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xbw7w"] Dec 09 04:45:38 crc kubenswrapper[4766]: E1209 04:45:38.620238 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63e3801-5ca0-48dd-9125-dded30ae25d9" containerName="init" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.620257 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63e3801-5ca0-48dd-9125-dded30ae25d9" containerName="init" Dec 09 04:45:38 crc kubenswrapper[4766]: E1209 
04:45:38.620274 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" containerName="extract-content" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.620280 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" containerName="extract-content" Dec 09 04:45:38 crc kubenswrapper[4766]: E1209 04:45:38.620293 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" containerName="registry-server" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.620299 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" containerName="registry-server" Dec 09 04:45:38 crc kubenswrapper[4766]: E1209 04:45:38.620311 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" containerName="extract-utilities" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.620319 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" containerName="extract-utilities" Dec 09 04:45:38 crc kubenswrapper[4766]: E1209 04:45:38.620329 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63e3801-5ca0-48dd-9125-dded30ae25d9" containerName="dnsmasq-dns" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.620334 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63e3801-5ca0-48dd-9125-dded30ae25d9" containerName="dnsmasq-dns" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.621063 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63e3801-5ca0-48dd-9125-dded30ae25d9" containerName="dnsmasq-dns" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.621089 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="200ba46b-02de-411d-96dc-4dec32c5330a" containerName="registry-server" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 
04:45:38.621762 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.630273 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xbw7w"] Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.702039 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-operator-scripts\") pod \"neutron-db-create-xbw7w\" (UID: \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\") " pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.702411 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-kube-api-access-4mt88\") pod \"neutron-db-create-xbw7w\" (UID: \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\") " pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.728544 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c01d-account-create-update-jxlnd"] Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.729847 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.731587 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.739075 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c01d-account-create-update-jxlnd"] Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.804801 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/713b0c31-e7ca-4673-90c1-f90879edd2ec-operator-scripts\") pod \"neutron-c01d-account-create-update-jxlnd\" (UID: \"713b0c31-e7ca-4673-90c1-f90879edd2ec\") " pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.804891 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-kube-api-access-4mt88\") pod \"neutron-db-create-xbw7w\" (UID: \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\") " pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.804935 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdnlj\" (UniqueName: \"kubernetes.io/projected/713b0c31-e7ca-4673-90c1-f90879edd2ec-kube-api-access-rdnlj\") pod \"neutron-c01d-account-create-update-jxlnd\" (UID: \"713b0c31-e7ca-4673-90c1-f90879edd2ec\") " pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.804954 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-operator-scripts\") pod \"neutron-db-create-xbw7w\" (UID: 
\"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\") " pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.805784 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-operator-scripts\") pod \"neutron-db-create-xbw7w\" (UID: \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\") " pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.824402 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-kube-api-access-4mt88\") pod \"neutron-db-create-xbw7w\" (UID: \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\") " pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.909484 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/713b0c31-e7ca-4673-90c1-f90879edd2ec-operator-scripts\") pod \"neutron-c01d-account-create-update-jxlnd\" (UID: \"713b0c31-e7ca-4673-90c1-f90879edd2ec\") " pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.909675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdnlj\" (UniqueName: \"kubernetes.io/projected/713b0c31-e7ca-4673-90c1-f90879edd2ec-kube-api-access-rdnlj\") pod \"neutron-c01d-account-create-update-jxlnd\" (UID: \"713b0c31-e7ca-4673-90c1-f90879edd2ec\") " pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.910265 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/713b0c31-e7ca-4673-90c1-f90879edd2ec-operator-scripts\") pod \"neutron-c01d-account-create-update-jxlnd\" (UID: 
\"713b0c31-e7ca-4673-90c1-f90879edd2ec\") " pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.929146 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdnlj\" (UniqueName: \"kubernetes.io/projected/713b0c31-e7ca-4673-90c1-f90879edd2ec-kube-api-access-rdnlj\") pod \"neutron-c01d-account-create-update-jxlnd\" (UID: \"713b0c31-e7ca-4673-90c1-f90879edd2ec\") " pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:38 crc kubenswrapper[4766]: I1209 04:45:38.938911 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.051479 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.436031 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xbw7w"] Dec 09 04:45:39 crc kubenswrapper[4766]: W1209 04:45:39.524967 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod713b0c31_e7ca_4673_90c1_f90879edd2ec.slice/crio-5eac6514767f6c2a1f235a581f4794996d9e048296d2ff38ba53e8127976f911 WatchSource:0}: Error finding container 5eac6514767f6c2a1f235a581f4794996d9e048296d2ff38ba53e8127976f911: Status 404 returned error can't find the container with id 5eac6514767f6c2a1f235a581f4794996d9e048296d2ff38ba53e8127976f911 Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.531336 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c01d-account-create-update-jxlnd"] Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.711972 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbw7w" 
event={"ID":"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb","Type":"ContainerStarted","Data":"57fccd2c1e9af4b151544b1817b148ea4fb4c4d690f084d9d99a822f0d9a29c3"} Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.712029 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbw7w" event={"ID":"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb","Type":"ContainerStarted","Data":"7df925264383346b7088dc2aec83fc1e10da92f2e42b5d31ea850b1868841da5"} Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.716984 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c01d-account-create-update-jxlnd" event={"ID":"713b0c31-e7ca-4673-90c1-f90879edd2ec","Type":"ContainerStarted","Data":"5a43adee1b5ea3ea7581e50e83c5ea968a364bd4ecd1692a4c77761a61106755"} Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.717088 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c01d-account-create-update-jxlnd" event={"ID":"713b0c31-e7ca-4673-90c1-f90879edd2ec","Type":"ContainerStarted","Data":"5eac6514767f6c2a1f235a581f4794996d9e048296d2ff38ba53e8127976f911"} Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.726058 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-xbw7w" podStartSLOduration=1.7260455810000002 podStartE2EDuration="1.726045581s" podCreationTimestamp="2025-12-09 04:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:39.724617333 +0000 UTC m=+5621.433922759" watchObservedRunningTime="2025-12-09 04:45:39.726045581 +0000 UTC m=+5621.435351007" Dec 09 04:45:39 crc kubenswrapper[4766]: I1209 04:45:39.743483 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c01d-account-create-update-jxlnd" podStartSLOduration=1.7434602620000001 podStartE2EDuration="1.743460262s" podCreationTimestamp="2025-12-09 04:45:38 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:39.740567174 +0000 UTC m=+5621.449872630" watchObservedRunningTime="2025-12-09 04:45:39.743460262 +0000 UTC m=+5621.452765688" Dec 09 04:45:40 crc kubenswrapper[4766]: I1209 04:45:40.732134 4766 generic.go:334] "Generic (PLEG): container finished" podID="b1a4b16b-b72d-4964-9b44-6b276ecf2eeb" containerID="57fccd2c1e9af4b151544b1817b148ea4fb4c4d690f084d9d99a822f0d9a29c3" exitCode=0 Dec 09 04:45:40 crc kubenswrapper[4766]: I1209 04:45:40.732361 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbw7w" event={"ID":"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb","Type":"ContainerDied","Data":"57fccd2c1e9af4b151544b1817b148ea4fb4c4d690f084d9d99a822f0d9a29c3"} Dec 09 04:45:40 crc kubenswrapper[4766]: I1209 04:45:40.735684 4766 generic.go:334] "Generic (PLEG): container finished" podID="713b0c31-e7ca-4673-90c1-f90879edd2ec" containerID="5a43adee1b5ea3ea7581e50e83c5ea968a364bd4ecd1692a4c77761a61106755" exitCode=0 Dec 09 04:45:40 crc kubenswrapper[4766]: I1209 04:45:40.735758 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c01d-account-create-update-jxlnd" event={"ID":"713b0c31-e7ca-4673-90c1-f90879edd2ec","Type":"ContainerDied","Data":"5a43adee1b5ea3ea7581e50e83c5ea968a364bd4ecd1692a4c77761a61106755"} Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.184677 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.192490 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.275685 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdnlj\" (UniqueName: \"kubernetes.io/projected/713b0c31-e7ca-4673-90c1-f90879edd2ec-kube-api-access-rdnlj\") pod \"713b0c31-e7ca-4673-90c1-f90879edd2ec\" (UID: \"713b0c31-e7ca-4673-90c1-f90879edd2ec\") " Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.275794 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/713b0c31-e7ca-4673-90c1-f90879edd2ec-operator-scripts\") pod \"713b0c31-e7ca-4673-90c1-f90879edd2ec\" (UID: \"713b0c31-e7ca-4673-90c1-f90879edd2ec\") " Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.275928 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-kube-api-access-4mt88\") pod \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\" (UID: \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\") " Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.275952 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-operator-scripts\") pod \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\" (UID: \"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb\") " Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.276802 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1a4b16b-b72d-4964-9b44-6b276ecf2eeb" (UID: "b1a4b16b-b72d-4964-9b44-6b276ecf2eeb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.276870 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/713b0c31-e7ca-4673-90c1-f90879edd2ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "713b0c31-e7ca-4673-90c1-f90879edd2ec" (UID: "713b0c31-e7ca-4673-90c1-f90879edd2ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.282036 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-kube-api-access-4mt88" (OuterVolumeSpecName: "kube-api-access-4mt88") pod "b1a4b16b-b72d-4964-9b44-6b276ecf2eeb" (UID: "b1a4b16b-b72d-4964-9b44-6b276ecf2eeb"). InnerVolumeSpecName "kube-api-access-4mt88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.282204 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713b0c31-e7ca-4673-90c1-f90879edd2ec-kube-api-access-rdnlj" (OuterVolumeSpecName: "kube-api-access-rdnlj") pod "713b0c31-e7ca-4673-90c1-f90879edd2ec" (UID: "713b0c31-e7ca-4673-90c1-f90879edd2ec"). InnerVolumeSpecName "kube-api-access-rdnlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.378172 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/713b0c31-e7ca-4673-90c1-f90879edd2ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.378210 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mt88\" (UniqueName: \"kubernetes.io/projected/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-kube-api-access-4mt88\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.378236 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.378245 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdnlj\" (UniqueName: \"kubernetes.io/projected/713b0c31-e7ca-4673-90c1-f90879edd2ec-kube-api-access-rdnlj\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.770729 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xbw7w" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.772365 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbw7w" event={"ID":"b1a4b16b-b72d-4964-9b44-6b276ecf2eeb","Type":"ContainerDied","Data":"7df925264383346b7088dc2aec83fc1e10da92f2e42b5d31ea850b1868841da5"} Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.772402 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7df925264383346b7088dc2aec83fc1e10da92f2e42b5d31ea850b1868841da5" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.774545 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c01d-account-create-update-jxlnd" event={"ID":"713b0c31-e7ca-4673-90c1-f90879edd2ec","Type":"ContainerDied","Data":"5eac6514767f6c2a1f235a581f4794996d9e048296d2ff38ba53e8127976f911"} Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.774573 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eac6514767f6c2a1f235a581f4794996d9e048296d2ff38ba53e8127976f911" Dec 09 04:45:42 crc kubenswrapper[4766]: I1209 04:45:42.774730 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c01d-account-create-update-jxlnd" Dec 09 04:45:42 crc kubenswrapper[4766]: E1209 04:45:42.850066 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod713b0c31_e7ca_4673_90c1_f90879edd2ec.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1a4b16b_b72d_4964_9b44_6b276ecf2eeb.slice/crio-7df925264383346b7088dc2aec83fc1e10da92f2e42b5d31ea850b1868841da5\": RecentStats: unable to find data in memory cache]" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.027603 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cff6f"] Dec 09 04:45:44 crc kubenswrapper[4766]: E1209 04:45:44.028348 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a4b16b-b72d-4964-9b44-6b276ecf2eeb" containerName="mariadb-database-create" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.028365 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a4b16b-b72d-4964-9b44-6b276ecf2eeb" containerName="mariadb-database-create" Dec 09 04:45:44 crc kubenswrapper[4766]: E1209 04:45:44.028393 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713b0c31-e7ca-4673-90c1-f90879edd2ec" containerName="mariadb-account-create-update" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.028402 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="713b0c31-e7ca-4673-90c1-f90879edd2ec" containerName="mariadb-account-create-update" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.028603 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a4b16b-b72d-4964-9b44-6b276ecf2eeb" containerName="mariadb-database-create" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.028627 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="713b0c31-e7ca-4673-90c1-f90879edd2ec" containerName="mariadb-account-create-update" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.029364 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.031177 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.031536 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4x627" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.032522 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.047002 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cff6f"] Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.110403 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-combined-ca-bundle\") pod \"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.110502 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2pd\" (UniqueName: \"kubernetes.io/projected/da38c4b9-7fb3-44be-a93b-bbad803dd227-kube-api-access-2q2pd\") pod \"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.110559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-config\") pod 
\"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.211993 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-combined-ca-bundle\") pod \"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.212078 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2pd\" (UniqueName: \"kubernetes.io/projected/da38c4b9-7fb3-44be-a93b-bbad803dd227-kube-api-access-2q2pd\") pod \"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.212106 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-config\") pod \"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.216969 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-combined-ca-bundle\") pod \"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.222343 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-config\") pod \"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: 
I1209 04:45:44.233300 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2pd\" (UniqueName: \"kubernetes.io/projected/da38c4b9-7fb3-44be-a93b-bbad803dd227-kube-api-access-2q2pd\") pod \"neutron-db-sync-cff6f\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.366849 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:44 crc kubenswrapper[4766]: I1209 04:45:44.878132 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cff6f"] Dec 09 04:45:45 crc kubenswrapper[4766]: I1209 04:45:45.814783 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cff6f" event={"ID":"da38c4b9-7fb3-44be-a93b-bbad803dd227","Type":"ContainerStarted","Data":"3d5d28d857d873b564683a7f1f1206d89186dfbf7fc1eb282ca9da5cece0ac10"} Dec 09 04:45:45 crc kubenswrapper[4766]: I1209 04:45:45.814835 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cff6f" event={"ID":"da38c4b9-7fb3-44be-a93b-bbad803dd227","Type":"ContainerStarted","Data":"12865907c8ca3ec0d21db96c4ed746427fad307502b3cb861b0986ad750b9dcc"} Dec 09 04:45:45 crc kubenswrapper[4766]: I1209 04:45:45.832516 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cff6f" podStartSLOduration=1.832497369 podStartE2EDuration="1.832497369s" podCreationTimestamp="2025-12-09 04:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:45.828518492 +0000 UTC m=+5627.537823918" watchObservedRunningTime="2025-12-09 04:45:45.832497369 +0000 UTC m=+5627.541802795" Dec 09 04:45:48 crc kubenswrapper[4766]: I1209 04:45:48.855570 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="da38c4b9-7fb3-44be-a93b-bbad803dd227" containerID="3d5d28d857d873b564683a7f1f1206d89186dfbf7fc1eb282ca9da5cece0ac10" exitCode=0 Dec 09 04:45:48 crc kubenswrapper[4766]: I1209 04:45:48.855654 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cff6f" event={"ID":"da38c4b9-7fb3-44be-a93b-bbad803dd227","Type":"ContainerDied","Data":"3d5d28d857d873b564683a7f1f1206d89186dfbf7fc1eb282ca9da5cece0ac10"} Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.264945 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.315909 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-config\") pod \"da38c4b9-7fb3-44be-a93b-bbad803dd227\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.315959 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q2pd\" (UniqueName: \"kubernetes.io/projected/da38c4b9-7fb3-44be-a93b-bbad803dd227-kube-api-access-2q2pd\") pod \"da38c4b9-7fb3-44be-a93b-bbad803dd227\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.316064 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-combined-ca-bundle\") pod \"da38c4b9-7fb3-44be-a93b-bbad803dd227\" (UID: \"da38c4b9-7fb3-44be-a93b-bbad803dd227\") " Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.322544 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da38c4b9-7fb3-44be-a93b-bbad803dd227-kube-api-access-2q2pd" (OuterVolumeSpecName: "kube-api-access-2q2pd") pod "da38c4b9-7fb3-44be-a93b-bbad803dd227" 
(UID: "da38c4b9-7fb3-44be-a93b-bbad803dd227"). InnerVolumeSpecName "kube-api-access-2q2pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.338131 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-config" (OuterVolumeSpecName: "config") pod "da38c4b9-7fb3-44be-a93b-bbad803dd227" (UID: "da38c4b9-7fb3-44be-a93b-bbad803dd227"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.339862 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da38c4b9-7fb3-44be-a93b-bbad803dd227" (UID: "da38c4b9-7fb3-44be-a93b-bbad803dd227"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.418456 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.418490 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q2pd\" (UniqueName: \"kubernetes.io/projected/da38c4b9-7fb3-44be-a93b-bbad803dd227-kube-api-access-2q2pd\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.418503 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38c4b9-7fb3-44be-a93b-bbad803dd227-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.884104 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cff6f" 
event={"ID":"da38c4b9-7fb3-44be-a93b-bbad803dd227","Type":"ContainerDied","Data":"12865907c8ca3ec0d21db96c4ed746427fad307502b3cb861b0986ad750b9dcc"} Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.884203 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12865907c8ca3ec0d21db96c4ed746427fad307502b3cb861b0986ad750b9dcc" Dec 09 04:45:50 crc kubenswrapper[4766]: I1209 04:45:50.884300 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cff6f" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.090650 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-964d6fb57-5fj6k"] Dec 09 04:45:51 crc kubenswrapper[4766]: E1209 04:45:51.091972 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da38c4b9-7fb3-44be-a93b-bbad803dd227" containerName="neutron-db-sync" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.091993 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="da38c4b9-7fb3-44be-a93b-bbad803dd227" containerName="neutron-db-sync" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.092144 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="da38c4b9-7fb3-44be-a93b-bbad803dd227" containerName="neutron-db-sync" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.093960 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.105858 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-964d6fb57-5fj6k"] Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.133174 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-dns-svc\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.133246 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-config\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.133277 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-sb\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.133341 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwnxt\" (UniqueName: \"kubernetes.io/projected/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-kube-api-access-pwnxt\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.133377 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-nb\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.223290 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b4db9d857-5hs6p"] Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.225072 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.227566 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4x627" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.227772 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.235957 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.237256 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-dns-svc\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.237300 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-config\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.237327 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-sb\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.237376 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwnxt\" (UniqueName: \"kubernetes.io/projected/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-kube-api-access-pwnxt\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.237403 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-nb\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.238278 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-nb\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.238849 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-dns-svc\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.239101 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b4db9d857-5hs6p"] Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.239453 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-config\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.239989 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-sb\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.266037 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwnxt\" (UniqueName: \"kubernetes.io/projected/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-kube-api-access-pwnxt\") pod \"dnsmasq-dns-964d6fb57-5fj6k\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.339030 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-httpd-config\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.339072 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-config\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.339089 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-j8nl4\" (UniqueName: \"kubernetes.io/projected/d1456b18-0735-4283-b268-f40d1fd42634-kube-api-access-j8nl4\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.339198 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-combined-ca-bundle\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.440796 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-httpd-config\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.440852 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-config\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.440877 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8nl4\" (UniqueName: \"kubernetes.io/projected/d1456b18-0735-4283-b268-f40d1fd42634-kube-api-access-j8nl4\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.440927 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-combined-ca-bundle\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.445952 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-config\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.445972 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-combined-ca-bundle\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.446355 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.453737 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1456b18-0735-4283-b268-f40d1fd42634-httpd-config\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.460274 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8nl4\" (UniqueName: \"kubernetes.io/projected/d1456b18-0735-4283-b268-f40d1fd42634-kube-api-access-j8nl4\") pod \"neutron-6b4db9d857-5hs6p\" (UID: \"d1456b18-0735-4283-b268-f40d1fd42634\") " pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.545592 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:51 crc kubenswrapper[4766]: I1209 04:45:51.882195 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-964d6fb57-5fj6k"] Dec 09 04:45:51 crc kubenswrapper[4766]: W1209 04:45:51.891400 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5ae1fc2_f3db_41a7_bc26_d3e71eceff52.slice/crio-1f1312b0f891be9c9ea5548e1b229e7b69a245a5c79ac8eb7a7f67d5fb0caee2 WatchSource:0}: Error finding container 1f1312b0f891be9c9ea5548e1b229e7b69a245a5c79ac8eb7a7f67d5fb0caee2: Status 404 returned error can't find the container with id 1f1312b0f891be9c9ea5548e1b229e7b69a245a5c79ac8eb7a7f67d5fb0caee2 Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.144931 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b4db9d857-5hs6p"] Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.920421 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b4db9d857-5hs6p" event={"ID":"d1456b18-0735-4283-b268-f40d1fd42634","Type":"ContainerStarted","Data":"7f34f680c6078ff1793fedd859f1a8923e9d882a06124a7ddc65356676813fa0"} Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.920697 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b4db9d857-5hs6p" event={"ID":"d1456b18-0735-4283-b268-f40d1fd42634","Type":"ContainerStarted","Data":"3e1f0335812343fef382d75322c3d5c824c249b3c7efa5e9a9e52cee1237cbc4"} Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.920708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b4db9d857-5hs6p" event={"ID":"d1456b18-0735-4283-b268-f40d1fd42634","Type":"ContainerStarted","Data":"4aaf5101bd7790ed7dd3191fd5043b7b3de8731490b3c3288b342aa81eecacbf"} Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.920881 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.923813 4766 generic.go:334] "Generic (PLEG): container finished" podID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" containerID="33c5cc23d35a4934ce510e0413547e615b6bb9ab37218a8a3bf3959865ebded5" exitCode=0 Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.923862 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" event={"ID":"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52","Type":"ContainerDied","Data":"33c5cc23d35a4934ce510e0413547e615b6bb9ab37218a8a3bf3959865ebded5"} Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.923894 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" event={"ID":"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52","Type":"ContainerStarted","Data":"1f1312b0f891be9c9ea5548e1b229e7b69a245a5c79ac8eb7a7f67d5fb0caee2"} Dec 09 04:45:52 crc kubenswrapper[4766]: I1209 04:45:52.949478 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b4db9d857-5hs6p" podStartSLOduration=1.949455006 podStartE2EDuration="1.949455006s" podCreationTimestamp="2025-12-09 04:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:52.936715552 +0000 UTC m=+5634.646021028" watchObservedRunningTime="2025-12-09 04:45:52.949455006 +0000 UTC m=+5634.658760452" Dec 09 04:45:53 crc kubenswrapper[4766]: I1209 04:45:53.936180 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" event={"ID":"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52","Type":"ContainerStarted","Data":"702c030d30e9db08afaffe505e04964481715caa91e7da25b90fb75f51b20e69"} Dec 09 04:45:53 crc kubenswrapper[4766]: I1209 04:45:53.937124 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:46:01 crc 
kubenswrapper[4766]: I1209 04:46:01.448391 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:46:01 crc kubenswrapper[4766]: I1209 04:46:01.477628 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" podStartSLOduration=10.477610495 podStartE2EDuration="10.477610495s" podCreationTimestamp="2025-12-09 04:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:45:53.960582441 +0000 UTC m=+5635.669887877" watchObservedRunningTime="2025-12-09 04:46:01.477610495 +0000 UTC m=+5643.186915921" Dec 09 04:46:01 crc kubenswrapper[4766]: I1209 04:46:01.521374 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557b9c7599-xg6p4"] Dec 09 04:46:01 crc kubenswrapper[4766]: I1209 04:46:01.521745 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" podUID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" containerName="dnsmasq-dns" containerID="cri-o://55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263" gracePeriod=10 Dec 09 04:46:01 crc kubenswrapper[4766]: I1209 04:46:01.992878 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.021718 4766 generic.go:334] "Generic (PLEG): container finished" podID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" containerID="55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263" exitCode=0 Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.021782 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" event={"ID":"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99","Type":"ContainerDied","Data":"55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263"} Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.021817 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" event={"ID":"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99","Type":"ContainerDied","Data":"bbc47e436e015b6d9ff9dd451346e3acc703f9ee1b3ba921c22b2e045a1ff1b7"} Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.021840 4766 scope.go:117] "RemoveContainer" containerID="55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.022082 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557b9c7599-xg6p4" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.041515 4766 scope.go:117] "RemoveContainer" containerID="5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.052736 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-nb\") pod \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.053009 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqslz\" (UniqueName: \"kubernetes.io/projected/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-kube-api-access-zqslz\") pod \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.053083 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-dns-svc\") pod \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.053127 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-sb\") pod \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\" (UID: \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.053174 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-config\") pod \"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\" (UID: 
\"4cfb75a0-0a92-474c-ad3f-3e0f27f28b99\") " Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.060964 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-kube-api-access-zqslz" (OuterVolumeSpecName: "kube-api-access-zqslz") pod "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" (UID: "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99"). InnerVolumeSpecName "kube-api-access-zqslz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.065682 4766 scope.go:117] "RemoveContainer" containerID="55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263" Dec 09 04:46:02 crc kubenswrapper[4766]: E1209 04:46:02.066684 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263\": container with ID starting with 55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263 not found: ID does not exist" containerID="55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.066741 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263"} err="failed to get container status \"55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263\": rpc error: code = NotFound desc = could not find container \"55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263\": container with ID starting with 55a1c9d633ec51e8c75d1d6f60e5e269a1976436734bc6c1e6975e6469c7a263 not found: ID does not exist" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.066776 4766 scope.go:117] "RemoveContainer" containerID="5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee" Dec 09 04:46:02 crc kubenswrapper[4766]: E1209 04:46:02.067330 4766 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee\": container with ID starting with 5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee not found: ID does not exist" containerID="5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.067372 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee"} err="failed to get container status \"5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee\": rpc error: code = NotFound desc = could not find container \"5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee\": container with ID starting with 5cf09b8c709102e46d8d5f407454d815b5b2106d2bf1ae89bb5459428a9558ee not found: ID does not exist" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.098787 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" (UID: "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.114752 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" (UID: "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.116687 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" (UID: "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.124716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-config" (OuterVolumeSpecName: "config") pod "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" (UID: "4cfb75a0-0a92-474c-ad3f-3e0f27f28b99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.155509 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.155546 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.155560 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqslz\" (UniqueName: \"kubernetes.io/projected/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-kube-api-access-zqslz\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.155572 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:02 crc 
kubenswrapper[4766]: I1209 04:46:02.155584 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.360519 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557b9c7599-xg6p4"] Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.370098 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-557b9c7599-xg6p4"] Dec 09 04:46:02 crc kubenswrapper[4766]: I1209 04:46:02.849498 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" path="/var/lib/kubelet/pods/4cfb75a0-0a92-474c-ad3f-3e0f27f28b99/volumes" Dec 09 04:46:07 crc kubenswrapper[4766]: I1209 04:46:07.316845 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:46:07 crc kubenswrapper[4766]: I1209 04:46:07.317587 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:46:07 crc kubenswrapper[4766]: I1209 04:46:07.317676 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:46:07 crc kubenswrapper[4766]: I1209 04:46:07.318696 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:46:07 crc kubenswrapper[4766]: I1209 04:46:07.318797 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" gracePeriod=600 Dec 09 04:46:07 crc kubenswrapper[4766]: E1209 04:46:07.458574 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:46:08 crc kubenswrapper[4766]: I1209 04:46:08.097714 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" exitCode=0 Dec 09 04:46:08 crc kubenswrapper[4766]: I1209 04:46:08.097764 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d"} Dec 09 04:46:08 crc kubenswrapper[4766]: I1209 04:46:08.097812 4766 scope.go:117] "RemoveContainer" containerID="caec6864870b555babc6bdf649e5fc84b874ca528991ac1003b50ad5ab1fca38" Dec 09 04:46:08 crc kubenswrapper[4766]: I1209 04:46:08.098974 4766 
scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:46:08 crc kubenswrapper[4766]: E1209 04:46:08.099601 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:46:21 crc kubenswrapper[4766]: I1209 04:46:21.559725 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6b4db9d857-5hs6p" Dec 09 04:46:22 crc kubenswrapper[4766]: I1209 04:46:22.840915 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:46:22 crc kubenswrapper[4766]: E1209 04:46:22.841684 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.415677 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sjv24"] Dec 09 04:46:24 crc kubenswrapper[4766]: E1209 04:46:24.416190 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" containerName="init" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.416234 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" containerName="init" 
Dec 09 04:46:24 crc kubenswrapper[4766]: E1209 04:46:24.416277 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" containerName="dnsmasq-dns" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.416288 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" containerName="dnsmasq-dns" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.416558 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfb75a0-0a92-474c-ad3f-3e0f27f28b99" containerName="dnsmasq-dns" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.418601 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.425040 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sjv24"] Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.576450 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-catalog-content\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.576783 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-utilities\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.576843 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7w8p\" (UniqueName: 
\"kubernetes.io/projected/c780da5d-b87c-4980-91db-9c2e3a872303-kube-api-access-n7w8p\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.678377 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-catalog-content\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.678425 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-utilities\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.678486 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7w8p\" (UniqueName: \"kubernetes.io/projected/c780da5d-b87c-4980-91db-9c2e3a872303-kube-api-access-n7w8p\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.679143 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-catalog-content\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.679201 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-utilities\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.703590 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7w8p\" (UniqueName: \"kubernetes.io/projected/c780da5d-b87c-4980-91db-9c2e3a872303-kube-api-access-n7w8p\") pod \"community-operators-sjv24\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:24 crc kubenswrapper[4766]: I1209 04:46:24.807967 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:25 crc kubenswrapper[4766]: I1209 04:46:25.352426 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sjv24"] Dec 09 04:46:26 crc kubenswrapper[4766]: I1209 04:46:26.274709 4766 generic.go:334] "Generic (PLEG): container finished" podID="c780da5d-b87c-4980-91db-9c2e3a872303" containerID="418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9" exitCode=0 Dec 09 04:46:26 crc kubenswrapper[4766]: I1209 04:46:26.274761 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjv24" event={"ID":"c780da5d-b87c-4980-91db-9c2e3a872303","Type":"ContainerDied","Data":"418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9"} Dec 09 04:46:26 crc kubenswrapper[4766]: I1209 04:46:26.274985 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjv24" event={"ID":"c780da5d-b87c-4980-91db-9c2e3a872303","Type":"ContainerStarted","Data":"d08a3686fe6fd2e339349f865c7878e2ef3c21c43eb9da89c6ac959c44730c22"} Dec 09 04:46:27 crc kubenswrapper[4766]: I1209 04:46:27.284613 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-sjv24" event={"ID":"c780da5d-b87c-4980-91db-9c2e3a872303","Type":"ContainerStarted","Data":"2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7"} Dec 09 04:46:28 crc kubenswrapper[4766]: I1209 04:46:28.298615 4766 generic.go:334] "Generic (PLEG): container finished" podID="c780da5d-b87c-4980-91db-9c2e3a872303" containerID="2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7" exitCode=0 Dec 09 04:46:28 crc kubenswrapper[4766]: I1209 04:46:28.298728 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjv24" event={"ID":"c780da5d-b87c-4980-91db-9c2e3a872303","Type":"ContainerDied","Data":"2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7"} Dec 09 04:46:28 crc kubenswrapper[4766]: I1209 04:46:28.949833 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jhtfz"] Dec 09 04:46:28 crc kubenswrapper[4766]: I1209 04:46:28.951578 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:28 crc kubenswrapper[4766]: I1209 04:46:28.970186 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jhtfz"] Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.056776 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-30f3-account-create-update-gjmrm"] Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.058088 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.059540 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-operator-scripts\") pod \"glance-db-create-jhtfz\" (UID: \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\") " pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.059701 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbct\" (UniqueName: \"kubernetes.io/projected/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-kube-api-access-cvbct\") pod \"glance-db-create-jhtfz\" (UID: \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\") " pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.064562 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.067163 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-30f3-account-create-update-gjmrm"] Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.161284 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lwh8\" (UniqueName: \"kubernetes.io/projected/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-kube-api-access-5lwh8\") pod \"glance-30f3-account-create-update-gjmrm\" (UID: \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\") " pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.161373 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-operator-scripts\") pod \"glance-30f3-account-create-update-gjmrm\" 
(UID: \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\") " pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.161423 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbct\" (UniqueName: \"kubernetes.io/projected/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-kube-api-access-cvbct\") pod \"glance-db-create-jhtfz\" (UID: \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\") " pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.161459 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-operator-scripts\") pod \"glance-db-create-jhtfz\" (UID: \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\") " pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.162428 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-operator-scripts\") pod \"glance-db-create-jhtfz\" (UID: \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\") " pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.181723 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbct\" (UniqueName: \"kubernetes.io/projected/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-kube-api-access-cvbct\") pod \"glance-db-create-jhtfz\" (UID: \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\") " pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.263650 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-operator-scripts\") pod \"glance-30f3-account-create-update-gjmrm\" (UID: \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\") " 
pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.263777 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lwh8\" (UniqueName: \"kubernetes.io/projected/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-kube-api-access-5lwh8\") pod \"glance-30f3-account-create-update-gjmrm\" (UID: \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\") " pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.264816 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-operator-scripts\") pod \"glance-30f3-account-create-update-gjmrm\" (UID: \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\") " pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.270565 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.280978 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lwh8\" (UniqueName: \"kubernetes.io/projected/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-kube-api-access-5lwh8\") pod \"glance-30f3-account-create-update-gjmrm\" (UID: \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\") " pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.318829 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjv24" event={"ID":"c780da5d-b87c-4980-91db-9c2e3a872303","Type":"ContainerStarted","Data":"8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051"} Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.335599 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sjv24" podStartSLOduration=2.894142587 podStartE2EDuration="5.335578222s" podCreationTimestamp="2025-12-09 04:46:24 +0000 UTC" firstStartedPulling="2025-12-09 04:46:26.276307775 +0000 UTC m=+5667.985613241" lastFinishedPulling="2025-12-09 04:46:28.71774342 +0000 UTC m=+5670.427048876" observedRunningTime="2025-12-09 04:46:29.332564161 +0000 UTC m=+5671.041869587" watchObservedRunningTime="2025-12-09 04:46:29.335578222 +0000 UTC m=+5671.044883648" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.378154 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.785578 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jhtfz"] Dec 09 04:46:29 crc kubenswrapper[4766]: I1209 04:46:29.859101 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-30f3-account-create-update-gjmrm"] Dec 09 04:46:29 crc kubenswrapper[4766]: W1209 04:46:29.878570 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87e3cb19_20a7_4b09_8c06_dd804f95f4c6.slice/crio-bbdeebaa23239a2c1a6b15b9daa680007d694d1e0f7339fd899bceff930c3524 WatchSource:0}: Error finding container bbdeebaa23239a2c1a6b15b9daa680007d694d1e0f7339fd899bceff930c3524: Status 404 returned error can't find the container with id bbdeebaa23239a2c1a6b15b9daa680007d694d1e0f7339fd899bceff930c3524 Dec 09 04:46:30 crc kubenswrapper[4766]: I1209 04:46:30.327600 4766 generic.go:334] "Generic (PLEG): container finished" podID="87e3cb19-20a7-4b09-8c06-dd804f95f4c6" containerID="34554c6730d06ad95e63a9244cbcac645d21c7d11356346774cc0c03fb9927b5" exitCode=0 Dec 09 04:46:30 crc kubenswrapper[4766]: I1209 04:46:30.327665 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-30f3-account-create-update-gjmrm" event={"ID":"87e3cb19-20a7-4b09-8c06-dd804f95f4c6","Type":"ContainerDied","Data":"34554c6730d06ad95e63a9244cbcac645d21c7d11356346774cc0c03fb9927b5"} Dec 09 04:46:30 crc kubenswrapper[4766]: I1209 04:46:30.327953 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-30f3-account-create-update-gjmrm" event={"ID":"87e3cb19-20a7-4b09-8c06-dd804f95f4c6","Type":"ContainerStarted","Data":"bbdeebaa23239a2c1a6b15b9daa680007d694d1e0f7339fd899bceff930c3524"} Dec 09 04:46:30 crc kubenswrapper[4766]: I1209 04:46:30.330417 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8" containerID="a7ee0eafd72377b2e560719262daa08f31290c148cfffda275689057d7aa4b36" exitCode=0 Dec 09 04:46:30 crc kubenswrapper[4766]: I1209 04:46:30.331701 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhtfz" event={"ID":"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8","Type":"ContainerDied","Data":"a7ee0eafd72377b2e560719262daa08f31290c148cfffda275689057d7aa4b36"} Dec 09 04:46:30 crc kubenswrapper[4766]: I1209 04:46:30.331747 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhtfz" event={"ID":"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8","Type":"ContainerStarted","Data":"bf7a1a8f6a1908374bda26731aca82a01dd79440e3c6cae394b47370578440fa"} Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.774587 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.784828 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.840529 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lwh8\" (UniqueName: \"kubernetes.io/projected/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-kube-api-access-5lwh8\") pod \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\" (UID: \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\") " Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.840761 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-operator-scripts\") pod \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\" (UID: \"87e3cb19-20a7-4b09-8c06-dd804f95f4c6\") " Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.840936 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbct\" (UniqueName: \"kubernetes.io/projected/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-kube-api-access-cvbct\") pod \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\" (UID: \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\") " Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.841085 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-operator-scripts\") pod \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\" (UID: \"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8\") " Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.841162 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87e3cb19-20a7-4b09-8c06-dd804f95f4c6" (UID: "87e3cb19-20a7-4b09-8c06-dd804f95f4c6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.841570 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8" (UID: "f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.841769 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.846274 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-kube-api-access-cvbct" (OuterVolumeSpecName: "kube-api-access-cvbct") pod "f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8" (UID: "f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8"). InnerVolumeSpecName "kube-api-access-cvbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.858845 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-kube-api-access-5lwh8" (OuterVolumeSpecName: "kube-api-access-5lwh8") pod "87e3cb19-20a7-4b09-8c06-dd804f95f4c6" (UID: "87e3cb19-20a7-4b09-8c06-dd804f95f4c6"). InnerVolumeSpecName "kube-api-access-5lwh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.942904 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.942939 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lwh8\" (UniqueName: \"kubernetes.io/projected/87e3cb19-20a7-4b09-8c06-dd804f95f4c6-kube-api-access-5lwh8\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:31 crc kubenswrapper[4766]: I1209 04:46:31.942951 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbct\" (UniqueName: \"kubernetes.io/projected/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8-kube-api-access-cvbct\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:32 crc kubenswrapper[4766]: I1209 04:46:32.356268 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-30f3-account-create-update-gjmrm" event={"ID":"87e3cb19-20a7-4b09-8c06-dd804f95f4c6","Type":"ContainerDied","Data":"bbdeebaa23239a2c1a6b15b9daa680007d694d1e0f7339fd899bceff930c3524"} Dec 09 04:46:32 crc kubenswrapper[4766]: I1209 04:46:32.356691 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbdeebaa23239a2c1a6b15b9daa680007d694d1e0f7339fd899bceff930c3524" Dec 09 04:46:32 crc kubenswrapper[4766]: I1209 04:46:32.356309 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-30f3-account-create-update-gjmrm" Dec 09 04:46:32 crc kubenswrapper[4766]: I1209 04:46:32.359150 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jhtfz" event={"ID":"f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8","Type":"ContainerDied","Data":"bf7a1a8f6a1908374bda26731aca82a01dd79440e3c6cae394b47370578440fa"} Dec 09 04:46:32 crc kubenswrapper[4766]: I1209 04:46:32.359206 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf7a1a8f6a1908374bda26731aca82a01dd79440e3c6cae394b47370578440fa" Dec 09 04:46:32 crc kubenswrapper[4766]: I1209 04:46:32.359248 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jhtfz" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.235791 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-kgh24"] Dec 09 04:46:34 crc kubenswrapper[4766]: E1209 04:46:34.236505 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e3cb19-20a7-4b09-8c06-dd804f95f4c6" containerName="mariadb-account-create-update" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.236521 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3cb19-20a7-4b09-8c06-dd804f95f4c6" containerName="mariadb-account-create-update" Dec 09 04:46:34 crc kubenswrapper[4766]: E1209 04:46:34.236546 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8" containerName="mariadb-database-create" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.236554 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8" containerName="mariadb-database-create" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.236746 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8" containerName="mariadb-database-create" Dec 09 
04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.236770 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e3cb19-20a7-4b09-8c06-dd804f95f4c6" containerName="mariadb-account-create-update" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.237394 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.240430 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-465hd" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.240695 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.252088 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kgh24"] Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.288855 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-config-data\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.288932 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-db-sync-config-data\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.289022 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-combined-ca-bundle\") pod \"glance-db-sync-kgh24\" (UID: 
\"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.289096 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz8qk\" (UniqueName: \"kubernetes.io/projected/d0f19d25-4821-4122-9149-1a479747837a-kube-api-access-sz8qk\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.390314 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz8qk\" (UniqueName: \"kubernetes.io/projected/d0f19d25-4821-4122-9149-1a479747837a-kube-api-access-sz8qk\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.390399 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-config-data\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.390433 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-db-sync-config-data\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.390462 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-combined-ca-bundle\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 
09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.395806 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-db-sync-config-data\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.396928 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-combined-ca-bundle\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.397181 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-config-data\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.426632 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz8qk\" (UniqueName: \"kubernetes.io/projected/d0f19d25-4821-4122-9149-1a479747837a-kube-api-access-sz8qk\") pod \"glance-db-sync-kgh24\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.577602 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.808812 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.809029 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:34 crc kubenswrapper[4766]: I1209 04:46:34.860276 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:35 crc kubenswrapper[4766]: I1209 04:46:35.200497 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kgh24"] Dec 09 04:46:35 crc kubenswrapper[4766]: I1209 04:46:35.383558 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kgh24" event={"ID":"d0f19d25-4821-4122-9149-1a479747837a","Type":"ContainerStarted","Data":"0a402b2314db2b0481b6d471cb36d10eca4c3577e920889eb78181339c257a73"} Dec 09 04:46:35 crc kubenswrapper[4766]: I1209 04:46:35.445624 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:35 crc kubenswrapper[4766]: I1209 04:46:35.491740 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sjv24"] Dec 09 04:46:36 crc kubenswrapper[4766]: I1209 04:46:36.394032 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kgh24" event={"ID":"d0f19d25-4821-4122-9149-1a479747837a","Type":"ContainerStarted","Data":"df54a1606f35d6b2cd3a23d7de1d52b269150e4e33b02655dae5c63ebc86c17d"} Dec 09 04:46:36 crc kubenswrapper[4766]: I1209 04:46:36.413530 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-kgh24" podStartSLOduration=2.413508774 
podStartE2EDuration="2.413508774s" podCreationTimestamp="2025-12-09 04:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:46:36.410713768 +0000 UTC m=+5678.120019234" watchObservedRunningTime="2025-12-09 04:46:36.413508774 +0000 UTC m=+5678.122814200" Dec 09 04:46:36 crc kubenswrapper[4766]: I1209 04:46:36.839705 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:46:36 crc kubenswrapper[4766]: E1209 04:46:36.840320 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.407653 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sjv24" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" containerName="registry-server" containerID="cri-o://8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051" gracePeriod=2 Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.791105 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.855295 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-catalog-content\") pod \"c780da5d-b87c-4980-91db-9c2e3a872303\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.855769 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7w8p\" (UniqueName: \"kubernetes.io/projected/c780da5d-b87c-4980-91db-9c2e3a872303-kube-api-access-n7w8p\") pod \"c780da5d-b87c-4980-91db-9c2e3a872303\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.855924 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-utilities\") pod \"c780da5d-b87c-4980-91db-9c2e3a872303\" (UID: \"c780da5d-b87c-4980-91db-9c2e3a872303\") " Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.857007 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-utilities" (OuterVolumeSpecName: "utilities") pod "c780da5d-b87c-4980-91db-9c2e3a872303" (UID: "c780da5d-b87c-4980-91db-9c2e3a872303"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.865613 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c780da5d-b87c-4980-91db-9c2e3a872303-kube-api-access-n7w8p" (OuterVolumeSpecName: "kube-api-access-n7w8p") pod "c780da5d-b87c-4980-91db-9c2e3a872303" (UID: "c780da5d-b87c-4980-91db-9c2e3a872303"). InnerVolumeSpecName "kube-api-access-n7w8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.958772 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:37 crc kubenswrapper[4766]: I1209 04:46:37.959019 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7w8p\" (UniqueName: \"kubernetes.io/projected/c780da5d-b87c-4980-91db-9c2e3a872303-kube-api-access-n7w8p\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.037424 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c780da5d-b87c-4980-91db-9c2e3a872303" (UID: "c780da5d-b87c-4980-91db-9c2e3a872303"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.062697 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c780da5d-b87c-4980-91db-9c2e3a872303-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.424601 4766 generic.go:334] "Generic (PLEG): container finished" podID="c780da5d-b87c-4980-91db-9c2e3a872303" containerID="8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051" exitCode=0 Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.424675 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sjv24" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.424685 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjv24" event={"ID":"c780da5d-b87c-4980-91db-9c2e3a872303","Type":"ContainerDied","Data":"8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051"} Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.424743 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sjv24" event={"ID":"c780da5d-b87c-4980-91db-9c2e3a872303","Type":"ContainerDied","Data":"d08a3686fe6fd2e339349f865c7878e2ef3c21c43eb9da89c6ac959c44730c22"} Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.424772 4766 scope.go:117] "RemoveContainer" containerID="8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.474254 4766 scope.go:117] "RemoveContainer" containerID="2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.490192 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sjv24"] Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.506918 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sjv24"] Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.514931 4766 scope.go:117] "RemoveContainer" containerID="418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.548722 4766 scope.go:117] "RemoveContainer" containerID="8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051" Dec 09 04:46:38 crc kubenswrapper[4766]: E1209 04:46:38.549513 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051\": container with ID starting with 8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051 not found: ID does not exist" containerID="8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.549580 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051"} err="failed to get container status \"8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051\": rpc error: code = NotFound desc = could not find container \"8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051\": container with ID starting with 8ef0fcd07657fcc3976b2ea73555385fda7e343bd7b06631070b5b70950ae051 not found: ID does not exist" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.549626 4766 scope.go:117] "RemoveContainer" containerID="2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7" Dec 09 04:46:38 crc kubenswrapper[4766]: E1209 04:46:38.549962 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7\": container with ID starting with 2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7 not found: ID does not exist" containerID="2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.549998 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7"} err="failed to get container status \"2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7\": rpc error: code = NotFound desc = could not find container \"2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7\": container with ID 
starting with 2ada3403aab7f885fec93f6116ad0fbe1d7356da43926ab7de31a9ed7306eab7 not found: ID does not exist" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.550020 4766 scope.go:117] "RemoveContainer" containerID="418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9" Dec 09 04:46:38 crc kubenswrapper[4766]: E1209 04:46:38.550573 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9\": container with ID starting with 418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9 not found: ID does not exist" containerID="418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.550675 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9"} err="failed to get container status \"418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9\": rpc error: code = NotFound desc = could not find container \"418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9\": container with ID starting with 418f964bf7097f0122c15df2601fb60a6b1a1c0833199cc70fa31d8bbd4ef4d9 not found: ID does not exist" Dec 09 04:46:38 crc kubenswrapper[4766]: I1209 04:46:38.851823 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" path="/var/lib/kubelet/pods/c780da5d-b87c-4980-91db-9c2e3a872303/volumes" Dec 09 04:46:39 crc kubenswrapper[4766]: I1209 04:46:39.439617 4766 generic.go:334] "Generic (PLEG): container finished" podID="d0f19d25-4821-4122-9149-1a479747837a" containerID="df54a1606f35d6b2cd3a23d7de1d52b269150e4e33b02655dae5c63ebc86c17d" exitCode=0 Dec 09 04:46:39 crc kubenswrapper[4766]: I1209 04:46:39.439717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-kgh24" event={"ID":"d0f19d25-4821-4122-9149-1a479747837a","Type":"ContainerDied","Data":"df54a1606f35d6b2cd3a23d7de1d52b269150e4e33b02655dae5c63ebc86c17d"} Dec 09 04:46:40 crc kubenswrapper[4766]: I1209 04:46:40.895438 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.025783 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz8qk\" (UniqueName: \"kubernetes.io/projected/d0f19d25-4821-4122-9149-1a479747837a-kube-api-access-sz8qk\") pod \"d0f19d25-4821-4122-9149-1a479747837a\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.025847 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-db-sync-config-data\") pod \"d0f19d25-4821-4122-9149-1a479747837a\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.025955 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-config-data\") pod \"d0f19d25-4821-4122-9149-1a479747837a\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.026088 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-combined-ca-bundle\") pod \"d0f19d25-4821-4122-9149-1a479747837a\" (UID: \"d0f19d25-4821-4122-9149-1a479747837a\") " Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.032812 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d0f19d25-4821-4122-9149-1a479747837a-kube-api-access-sz8qk" (OuterVolumeSpecName: "kube-api-access-sz8qk") pod "d0f19d25-4821-4122-9149-1a479747837a" (UID: "d0f19d25-4821-4122-9149-1a479747837a"). InnerVolumeSpecName "kube-api-access-sz8qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.033266 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d0f19d25-4821-4122-9149-1a479747837a" (UID: "d0f19d25-4821-4122-9149-1a479747837a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.058947 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0f19d25-4821-4122-9149-1a479747837a" (UID: "d0f19d25-4821-4122-9149-1a479747837a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.076996 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-config-data" (OuterVolumeSpecName: "config-data") pod "d0f19d25-4821-4122-9149-1a479747837a" (UID: "d0f19d25-4821-4122-9149-1a479747837a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.129237 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.129283 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz8qk\" (UniqueName: \"kubernetes.io/projected/d0f19d25-4821-4122-9149-1a479747837a-kube-api-access-sz8qk\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.129297 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.129312 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f19d25-4821-4122-9149-1a479747837a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.468414 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kgh24" event={"ID":"d0f19d25-4821-4122-9149-1a479747837a","Type":"ContainerDied","Data":"0a402b2314db2b0481b6d471cb36d10eca4c3577e920889eb78181339c257a73"} Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.468476 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a402b2314db2b0481b6d471cb36d10eca4c3577e920889eb78181339c257a73" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.468566 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kgh24" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.809681 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 04:46:41 crc kubenswrapper[4766]: E1209 04:46:41.810021 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" containerName="registry-server" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.810037 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" containerName="registry-server" Dec 09 04:46:41 crc kubenswrapper[4766]: E1209 04:46:41.810062 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" containerName="extract-utilities" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.810069 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" containerName="extract-utilities" Dec 09 04:46:41 crc kubenswrapper[4766]: E1209 04:46:41.810082 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f19d25-4821-4122-9149-1a479747837a" containerName="glance-db-sync" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.810089 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f19d25-4821-4122-9149-1a479747837a" containerName="glance-db-sync" Dec 09 04:46:41 crc kubenswrapper[4766]: E1209 04:46:41.810108 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" containerName="extract-content" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.810115 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" containerName="extract-content" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.810619 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f19d25-4821-4122-9149-1a479747837a" 
containerName="glance-db-sync" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.810659 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c780da5d-b87c-4980-91db-9c2e3a872303" containerName="registry-server" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.811549 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.825011 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.833810 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.833988 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.834090 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-465hd" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.834196 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.846281 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-logs\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.846356 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-config-data\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") 
" pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.846408 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.846434 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8bbs\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-kube-api-access-h8bbs\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.846453 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.846473 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-scripts\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.846497 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.944747 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56ffb59c77-b4j2r"] Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.946726 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.947707 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-config-data\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.947765 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.947795 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bbs\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-kube-api-access-h8bbs\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.947814 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" 
Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.947835 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-scripts\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.947859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-ceph\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.947897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-logs\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.948364 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-logs\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.948595 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.954654 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-scripts\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.962595 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.967860 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56ffb59c77-b4j2r"] Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.971509 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-ceph\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.973716 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-config-data\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:41 crc kubenswrapper[4766]: I1209 04:46:41.993648 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8bbs\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-kube-api-access-h8bbs\") pod \"glance-default-external-api-0\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") " pod="openstack/glance-default-external-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.036370 4766 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.037629 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.049564 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.051256 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-dns-svc\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.051306 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dh4\" (UniqueName: \"kubernetes.io/projected/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-kube-api-access-d7dh4\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.051342 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-nb\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.051387 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-sb\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: 
\"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.051426 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-config\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.055793 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153191 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153349 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-dns-svc\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153402 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153438 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dh4\" (UniqueName: 
\"kubernetes.io/projected/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-kube-api-access-d7dh4\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153462 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-logs\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153487 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153519 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74blq\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-kube-api-access-74blq\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153543 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-nb\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153602 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-sb\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153679 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.153703 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-config\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.154921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-config\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.155052 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-nb\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.155739 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-sb\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.155936 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-dns-svc\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.164460 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.182373 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dh4\" (UniqueName: \"kubernetes.io/projected/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-kube-api-access-d7dh4\") pod \"dnsmasq-dns-56ffb59c77-b4j2r\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.255661 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.255991 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.256029 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.256100 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" 
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.256141 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-logs\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.256168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.256199 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74blq\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-kube-api-access-74blq\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.259517 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-logs\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.259553 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.263566 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.266333 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.267471 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.275804 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.277642 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74blq\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-kube-api-access-74blq\") pod \"glance-default-internal-api-0\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") " pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.362875 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r"
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.370694 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.728732 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.836291 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 04:46:42 crc kubenswrapper[4766]: I1209 04:46:42.852400 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56ffb59c77-b4j2r"]
Dec 09 04:46:42 crc kubenswrapper[4766]: W1209 04:46:42.858192 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3dd023b_bcc3_4f74_b1ab_f29801cd7643.slice/crio-69c07bdf56466ceade4c58603647a3b88a82eb889619b5bb2e2ca85f38483614 WatchSource:0}: Error finding container 69c07bdf56466ceade4c58603647a3b88a82eb889619b5bb2e2ca85f38483614: Status 404 returned error can't find the container with id 69c07bdf56466ceade4c58603647a3b88a82eb889619b5bb2e2ca85f38483614
Dec 09 04:46:43 crc kubenswrapper[4766]: I1209 04:46:43.034603 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 04:46:43 crc kubenswrapper[4766]: I1209 04:46:43.502128 4766 generic.go:334] "Generic (PLEG): container finished" podID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" containerID="150fad522e4623716a6f75420d45933346130ba8caaada3df9463c8aa10cbac1" exitCode=0
Dec 09 04:46:43 crc kubenswrapper[4766]: I1209 04:46:43.502270 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" event={"ID":"b3dd023b-bcc3-4f74-b1ab-f29801cd7643","Type":"ContainerDied","Data":"150fad522e4623716a6f75420d45933346130ba8caaada3df9463c8aa10cbac1"}
Dec 09 04:46:43 crc kubenswrapper[4766]: I1209 04:46:43.503506 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" event={"ID":"b3dd023b-bcc3-4f74-b1ab-f29801cd7643","Type":"ContainerStarted","Data":"69c07bdf56466ceade4c58603647a3b88a82eb889619b5bb2e2ca85f38483614"}
Dec 09 04:46:43 crc kubenswrapper[4766]: I1209 04:46:43.507264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfb0727d-7f01-4bdd-9448-96db00678c41","Type":"ContainerStarted","Data":"a95f0ef8efa0a87fbf448ae6ce03803a790571930a7f55e0501062321a6b902b"}
Dec 09 04:46:43 crc kubenswrapper[4766]: I1209 04:46:43.514757 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d157befc-8015-4b97-af39-a5bf8762345a","Type":"ContainerStarted","Data":"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45"}
Dec 09 04:46:43 crc kubenswrapper[4766]: I1209 04:46:43.514801 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d157befc-8015-4b97-af39-a5bf8762345a","Type":"ContainerStarted","Data":"058b928f32cb3254b9785a61eb6d2188aff4ce93f5cf56faa7528971c8b6f583"}
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.527121 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfb0727d-7f01-4bdd-9448-96db00678c41","Type":"ContainerStarted","Data":"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57"}
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.527698 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfb0727d-7f01-4bdd-9448-96db00678c41","Type":"ContainerStarted","Data":"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9"}
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.530973 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d157befc-8015-4b97-af39-a5bf8762345a","Type":"ContainerStarted","Data":"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9"}
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.531119 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d157befc-8015-4b97-af39-a5bf8762345a" containerName="glance-log" containerID="cri-o://48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45" gracePeriod=30
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.531391 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d157befc-8015-4b97-af39-a5bf8762345a" containerName="glance-httpd" containerID="cri-o://d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9" gracePeriod=30
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.535366 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" event={"ID":"b3dd023b-bcc3-4f74-b1ab-f29801cd7643","Type":"ContainerStarted","Data":"686e1f6ad644303c5903b9d33cc07198eca6f83b86deb6d9e18c67f456882670"}
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.535602 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r"
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.561576 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.561550245 podStartE2EDuration="2.561550245s" podCreationTimestamp="2025-12-09 04:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:46:44.55688461 +0000 UTC m=+5686.266190046" watchObservedRunningTime="2025-12-09 04:46:44.561550245 +0000 UTC m=+5686.270855671"
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.573704 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.573685074 podStartE2EDuration="3.573685074s" podCreationTimestamp="2025-12-09 04:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:46:44.572617544 +0000 UTC m=+5686.281922960" watchObservedRunningTime="2025-12-09 04:46:44.573685074 +0000 UTC m=+5686.282990500"
Dec 09 04:46:44 crc kubenswrapper[4766]: I1209 04:46:44.592392 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" podStartSLOduration=3.592372938 podStartE2EDuration="3.592372938s" podCreationTimestamp="2025-12-09 04:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:46:44.58796604 +0000 UTC m=+5686.297271466" watchObservedRunningTime="2025-12-09 04:46:44.592372938 +0000 UTC m=+5686.301678364"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.133581 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.506169 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.526284 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-combined-ca-bundle\") pod \"d157befc-8015-4b97-af39-a5bf8762345a\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") "
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.526338 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-logs\") pod \"d157befc-8015-4b97-af39-a5bf8762345a\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") "
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.526393 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8bbs\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-kube-api-access-h8bbs\") pod \"d157befc-8015-4b97-af39-a5bf8762345a\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") "
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.526479 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-scripts\") pod \"d157befc-8015-4b97-af39-a5bf8762345a\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") "
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.526507 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-httpd-run\") pod \"d157befc-8015-4b97-af39-a5bf8762345a\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") "
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.526591 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-config-data\") pod \"d157befc-8015-4b97-af39-a5bf8762345a\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") "
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.526641 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-ceph\") pod \"d157befc-8015-4b97-af39-a5bf8762345a\" (UID: \"d157befc-8015-4b97-af39-a5bf8762345a\") "
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.527572 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d157befc-8015-4b97-af39-a5bf8762345a" (UID: "d157befc-8015-4b97-af39-a5bf8762345a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.527827 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-logs" (OuterVolumeSpecName: "logs") pod "d157befc-8015-4b97-af39-a5bf8762345a" (UID: "d157befc-8015-4b97-af39-a5bf8762345a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.535421 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-kube-api-access-h8bbs" (OuterVolumeSpecName: "kube-api-access-h8bbs") pod "d157befc-8015-4b97-af39-a5bf8762345a" (UID: "d157befc-8015-4b97-af39-a5bf8762345a"). InnerVolumeSpecName "kube-api-access-h8bbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.535569 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-ceph" (OuterVolumeSpecName: "ceph") pod "d157befc-8015-4b97-af39-a5bf8762345a" (UID: "d157befc-8015-4b97-af39-a5bf8762345a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.556494 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d157befc-8015-4b97-af39-a5bf8762345a" (UID: "d157befc-8015-4b97-af39-a5bf8762345a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.557499 4766 generic.go:334] "Generic (PLEG): container finished" podID="d157befc-8015-4b97-af39-a5bf8762345a" containerID="d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9" exitCode=0
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.557524 4766 generic.go:334] "Generic (PLEG): container finished" podID="d157befc-8015-4b97-af39-a5bf8762345a" containerID="48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45" exitCode=143
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.558518 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.559017 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d157befc-8015-4b97-af39-a5bf8762345a","Type":"ContainerDied","Data":"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9"}
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.559052 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d157befc-8015-4b97-af39-a5bf8762345a","Type":"ContainerDied","Data":"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45"}
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.559065 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d157befc-8015-4b97-af39-a5bf8762345a","Type":"ContainerDied","Data":"058b928f32cb3254b9785a61eb6d2188aff4ce93f5cf56faa7528971c8b6f583"}
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.559085 4766 scope.go:117] "RemoveContainer" containerID="d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.565315 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-scripts" (OuterVolumeSpecName: "scripts") pod "d157befc-8015-4b97-af39-a5bf8762345a" (UID: "d157befc-8015-4b97-af39-a5bf8762345a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.592453 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-config-data" (OuterVolumeSpecName: "config-data") pod "d157befc-8015-4b97-af39-a5bf8762345a" (UID: "d157befc-8015-4b97-af39-a5bf8762345a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.630466 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.630537 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.630551 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.630586 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-ceph\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.630606 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d157befc-8015-4b97-af39-a5bf8762345a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.630621 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d157befc-8015-4b97-af39-a5bf8762345a-logs\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.630632 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8bbs\" (UniqueName: \"kubernetes.io/projected/d157befc-8015-4b97-af39-a5bf8762345a-kube-api-access-h8bbs\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.688490 4766 scope.go:117] "RemoveContainer" containerID="48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.707031 4766 scope.go:117] "RemoveContainer" containerID="d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9"
Dec 09 04:46:45 crc kubenswrapper[4766]: E1209 04:46:45.707750 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9\": container with ID starting with d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9 not found: ID does not exist" containerID="d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.707822 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9"} err="failed to get container status \"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9\": rpc error: code = NotFound desc = could not find container \"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9\": container with ID starting with d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9 not found: ID does not exist"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.707850 4766 scope.go:117] "RemoveContainer" containerID="48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45"
Dec 09 04:46:45 crc kubenswrapper[4766]: E1209 04:46:45.708326 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45\": container with ID starting with 48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45 not found: ID does not exist" containerID="48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.708349 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45"} err="failed to get container status \"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45\": rpc error: code = NotFound desc = could not find container \"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45\": container with ID starting with 48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45 not found: ID does not exist"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.708363 4766 scope.go:117] "RemoveContainer" containerID="d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.708602 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9"} err="failed to get container status \"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9\": rpc error: code = NotFound desc = could not find container \"d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9\": container with ID starting with d159f3e6c53637f06eb3be803811e76431012358154d0da093aecbe61cddadc9 not found: ID does not exist"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.708640 4766 scope.go:117] "RemoveContainer" containerID="48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.709161 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45"} err="failed to get container status \"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45\": rpc error: code = NotFound desc = could not find container \"48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45\": container with ID starting with 48d225301324c4d0829d7ca53dacbf45608e95cb555aaf5c2086a997283c9f45 not found: ID does not exist"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.888630 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.894996 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.920263 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 04:46:45 crc kubenswrapper[4766]: E1209 04:46:45.920660 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d157befc-8015-4b97-af39-a5bf8762345a" containerName="glance-log"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.920677 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d157befc-8015-4b97-af39-a5bf8762345a" containerName="glance-log"
Dec 09 04:46:45 crc kubenswrapper[4766]: E1209 04:46:45.920703 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d157befc-8015-4b97-af39-a5bf8762345a" containerName="glance-httpd"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.920710 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d157befc-8015-4b97-af39-a5bf8762345a" containerName="glance-httpd"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.920899 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d157befc-8015-4b97-af39-a5bf8762345a" containerName="glance-log"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.920920 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d157befc-8015-4b97-af39-a5bf8762345a" containerName="glance-httpd"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.921821 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.924933 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 09 04:46:45 crc kubenswrapper[4766]: I1209 04:46:45.931145 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.037797 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.037868 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw74x\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-kube-api-access-xw74x\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.037893 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.037921 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-logs\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.037952 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-ceph\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.037975 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.037995 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.140372 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-ceph\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.140520 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.140580 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.140764 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.140912 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw74x\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-kube-api-access-xw74x\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.140968 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.141055 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-logs\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.141689 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-logs\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.141737 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.144428 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.144669 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.145481 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.145675 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-ceph\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.158892 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw74x\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-kube-api-access-xw74x\") pod \"glance-default-external-api-0\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.239624 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.565996 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerName="glance-log" containerID="cri-o://a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9" gracePeriod=30
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.566140 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerName="glance-httpd" containerID="cri-o://4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57" gracePeriod=30
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.762807 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 09 04:46:46 crc kubenswrapper[4766]: I1209 04:46:46.869174 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d157befc-8015-4b97-af39-a5bf8762345a" path="/var/lib/kubelet/pods/d157befc-8015-4b97-af39-a5bf8762345a/volumes"
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.243099 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.359594 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-httpd-run\") pod \"cfb0727d-7f01-4bdd-9448-96db00678c41\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") "
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.359940 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-ceph\") pod \"cfb0727d-7f01-4bdd-9448-96db00678c41\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") "
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.360034 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-combined-ca-bundle\") pod \"cfb0727d-7f01-4bdd-9448-96db00678c41\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") "
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.360056 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-logs\") pod \"cfb0727d-7f01-4bdd-9448-96db00678c41\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") "
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.360114 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74blq\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-kube-api-access-74blq\") pod \"cfb0727d-7f01-4bdd-9448-96db00678c41\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") "
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.360143 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-config-data\") pod \"cfb0727d-7f01-4bdd-9448-96db00678c41\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") "
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.360285 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-scripts\") pod \"cfb0727d-7f01-4bdd-9448-96db00678c41\" (UID: \"cfb0727d-7f01-4bdd-9448-96db00678c41\") "
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.360820 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cfb0727d-7f01-4bdd-9448-96db00678c41" (UID: "cfb0727d-7f01-4bdd-9448-96db00678c41"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.360849 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-logs" (OuterVolumeSpecName: "logs") pod "cfb0727d-7f01-4bdd-9448-96db00678c41" (UID: "cfb0727d-7f01-4bdd-9448-96db00678c41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.363158 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-ceph" (OuterVolumeSpecName: "ceph") pod "cfb0727d-7f01-4bdd-9448-96db00678c41" (UID: "cfb0727d-7f01-4bdd-9448-96db00678c41"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.366601 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-kube-api-access-74blq" (OuterVolumeSpecName: "kube-api-access-74blq") pod "cfb0727d-7f01-4bdd-9448-96db00678c41" (UID: "cfb0727d-7f01-4bdd-9448-96db00678c41"). InnerVolumeSpecName "kube-api-access-74blq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.376187 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-scripts" (OuterVolumeSpecName: "scripts") pod "cfb0727d-7f01-4bdd-9448-96db00678c41" (UID: "cfb0727d-7f01-4bdd-9448-96db00678c41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.401441 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfb0727d-7f01-4bdd-9448-96db00678c41" (UID: "cfb0727d-7f01-4bdd-9448-96db00678c41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.426535 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-config-data" (OuterVolumeSpecName: "config-data") pod "cfb0727d-7f01-4bdd-9448-96db00678c41" (UID: "cfb0727d-7f01-4bdd-9448-96db00678c41"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.461845 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74blq\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-kube-api-access-74blq\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.461873 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.461883 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.461891 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.461901 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cfb0727d-7f01-4bdd-9448-96db00678c41-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.461911 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb0727d-7f01-4bdd-9448-96db00678c41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.461918 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfb0727d-7f01-4bdd-9448-96db00678c41-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.582619 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8","Type":"ContainerStarted","Data":"feb162cbf9533d45e263da4954c3f02a63b71ddfaa4d43d8f64549dd37d305e9"} Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.583716 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8","Type":"ContainerStarted","Data":"cfdaa7744aa17718c98b2835033513c500b53353387fb5801b21a1a3763e28eb"} Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.587197 4766 generic.go:334] "Generic (PLEG): container finished" podID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerID="4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57" exitCode=0 Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.587241 4766 generic.go:334] "Generic (PLEG): container finished" podID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerID="a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9" exitCode=143 Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.587263 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfb0727d-7f01-4bdd-9448-96db00678c41","Type":"ContainerDied","Data":"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57"} Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.587291 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfb0727d-7f01-4bdd-9448-96db00678c41","Type":"ContainerDied","Data":"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9"} Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.587302 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cfb0727d-7f01-4bdd-9448-96db00678c41","Type":"ContainerDied","Data":"a95f0ef8efa0a87fbf448ae6ce03803a790571930a7f55e0501062321a6b902b"} Dec 09 04:46:47 crc 
kubenswrapper[4766]: I1209 04:46:47.587317 4766 scope.go:117] "RemoveContainer" containerID="4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.587415 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.630374 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.635741 4766 scope.go:117] "RemoveContainer" containerID="a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.656078 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.670583 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:46:47 crc kubenswrapper[4766]: E1209 04:46:47.671045 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerName="glance-log" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.671060 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerName="glance-log" Dec 09 04:46:47 crc kubenswrapper[4766]: E1209 04:46:47.671090 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerName="glance-httpd" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.671098 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerName="glance-httpd" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.671353 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerName="glance-httpd" 
Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.671381 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" containerName="glance-log" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.672510 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.672857 4766 scope.go:117] "RemoveContainer" containerID="4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.675068 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.682279 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:46:47 crc kubenswrapper[4766]: E1209 04:46:47.687566 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57\": container with ID starting with 4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57 not found: ID does not exist" containerID="4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.687614 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57"} err="failed to get container status \"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57\": rpc error: code = NotFound desc = could not find container \"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57\": container with ID starting with 4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57 not found: ID does not exist" Dec 09 04:46:47 crc 
kubenswrapper[4766]: I1209 04:46:47.687645 4766 scope.go:117] "RemoveContainer" containerID="a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9" Dec 09 04:46:47 crc kubenswrapper[4766]: E1209 04:46:47.687891 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9\": container with ID starting with a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9 not found: ID does not exist" containerID="a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.687915 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9"} err="failed to get container status \"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9\": rpc error: code = NotFound desc = could not find container \"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9\": container with ID starting with a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9 not found: ID does not exist" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.687927 4766 scope.go:117] "RemoveContainer" containerID="4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.688512 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57"} err="failed to get container status \"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57\": rpc error: code = NotFound desc = could not find container \"4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57\": container with ID starting with 4a9c8921d14eefb5826ee131e615be9ae4968ca2f70769ff3aa1ef29ada8bd57 not found: ID does not exist" Dec 09 
04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.688533 4766 scope.go:117] "RemoveContainer" containerID="a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.690174 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9"} err="failed to get container status \"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9\": rpc error: code = NotFound desc = could not find container \"a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9\": container with ID starting with a8a49e00d1d7c40ea45a3f9c05d4429e2b476ecb51655aeaea7b4d29cea746b9 not found: ID does not exist" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.766949 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-logs\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.767063 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.767105 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: 
I1209 04:46:47.767126 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.767238 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.767273 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njh6m\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-kube-api-access-njh6m\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.767325 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.869452 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc 
kubenswrapper[4766]: I1209 04:46:47.869852 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.870143 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.870175 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.870252 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.870276 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njh6m\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-kube-api-access-njh6m\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.870312 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.872507 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-logs\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.872901 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-logs\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.876106 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.886963 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.887386 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.887561 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.889923 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njh6m\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-kube-api-access-njh6m\") pod \"glance-default-internal-api-0\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:46:47 crc kubenswrapper[4766]: I1209 04:46:47.999059 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 04:46:48 crc kubenswrapper[4766]: I1209 04:46:48.580878 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:46:48 crc kubenswrapper[4766]: I1209 04:46:48.597540 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8","Type":"ContainerStarted","Data":"2fdebba462743f7669b38b4292df1c429a8c0470918a04c8b76e88b1cff1e4c2"} Dec 09 04:46:48 crc kubenswrapper[4766]: I1209 04:46:48.603368 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91eb833a-69ae-4fee-bf91-984f74e2291f","Type":"ContainerStarted","Data":"a841dbf0a094ef80ffd5890868432f0ce330df2a45cf26dc5e7215ef39104915"} Dec 09 04:46:48 crc kubenswrapper[4766]: I1209 04:46:48.629181 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.629167813 podStartE2EDuration="3.629167813s" podCreationTimestamp="2025-12-09 04:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:46:48.618662149 +0000 UTC m=+5690.327967575" watchObservedRunningTime="2025-12-09 04:46:48.629167813 +0000 UTC m=+5690.338473239" Dec 09 04:46:48 crc kubenswrapper[4766]: I1209 04:46:48.863592 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb0727d-7f01-4bdd-9448-96db00678c41" path="/var/lib/kubelet/pods/cfb0727d-7f01-4bdd-9448-96db00678c41/volumes" Dec 09 04:46:49 crc kubenswrapper[4766]: I1209 04:46:49.613567 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"91eb833a-69ae-4fee-bf91-984f74e2291f","Type":"ContainerStarted","Data":"2f369150b7ee568cdfe54ed4396c15d5014c11103a0b1a2645fb7f8910e33faf"} Dec 09 04:46:49 crc kubenswrapper[4766]: I1209 04:46:49.613948 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91eb833a-69ae-4fee-bf91-984f74e2291f","Type":"ContainerStarted","Data":"64e3d278b9e3fbd760a419c198ca16e7fe4c0ea743181c33fcb6e1c33cc44a00"} Dec 09 04:46:49 crc kubenswrapper[4766]: I1209 04:46:49.839579 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:46:49 crc kubenswrapper[4766]: E1209 04:46:49.839907 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.364504 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.386021 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.386001319 podStartE2EDuration="5.386001319s" podCreationTimestamp="2025-12-09 04:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:46:49.64072208 +0000 UTC m=+5691.350027506" watchObservedRunningTime="2025-12-09 04:46:52.386001319 +0000 UTC m=+5694.095306765" Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.416684 4766 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-964d6fb57-5fj6k"] Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.416969 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" podUID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" containerName="dnsmasq-dns" containerID="cri-o://702c030d30e9db08afaffe505e04964481715caa91e7da25b90fb75f51b20e69" gracePeriod=10 Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.647388 4766 generic.go:334] "Generic (PLEG): container finished" podID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" containerID="702c030d30e9db08afaffe505e04964481715caa91e7da25b90fb75f51b20e69" exitCode=0 Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.647438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" event={"ID":"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52","Type":"ContainerDied","Data":"702c030d30e9db08afaffe505e04964481715caa91e7da25b90fb75f51b20e69"} Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.882701 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.980603 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-config\") pod \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.980671 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-sb\") pod \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.980771 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-dns-svc\") pod \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.980808 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-nb\") pod \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.980842 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwnxt\" (UniqueName: \"kubernetes.io/projected/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-kube-api-access-pwnxt\") pod \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\" (UID: \"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52\") " Dec 09 04:46:52 crc kubenswrapper[4766]: I1209 04:46:52.986244 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-kube-api-access-pwnxt" (OuterVolumeSpecName: "kube-api-access-pwnxt") pod "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" (UID: "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52"). InnerVolumeSpecName "kube-api-access-pwnxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.022881 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" (UID: "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.023601 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" (UID: "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.034319 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-config" (OuterVolumeSpecName: "config") pod "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" (UID: "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.046464 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" (UID: "f5ae1fc2-f3db-41a7-bc26-d3e71eceff52"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.082877 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-config\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.083167 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.083254 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.083316 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.083376 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwnxt\" (UniqueName: \"kubernetes.io/projected/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52-kube-api-access-pwnxt\") on node \"crc\" DevicePath \"\""
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.679714 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k" event={"ID":"f5ae1fc2-f3db-41a7-bc26-d3e71eceff52","Type":"ContainerDied","Data":"1f1312b0f891be9c9ea5548e1b229e7b69a245a5c79ac8eb7a7f67d5fb0caee2"}
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.679794 4766 scope.go:117] "RemoveContainer" containerID="702c030d30e9db08afaffe505e04964481715caa91e7da25b90fb75f51b20e69"
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.680507 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-964d6fb57-5fj6k"
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.710264 4766 scope.go:117] "RemoveContainer" containerID="33c5cc23d35a4934ce510e0413547e615b6bb9ab37218a8a3bf3959865ebded5"
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.724674 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-964d6fb57-5fj6k"]
Dec 09 04:46:53 crc kubenswrapper[4766]: I1209 04:46:53.732016 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-964d6fb57-5fj6k"]
Dec 09 04:46:54 crc kubenswrapper[4766]: I1209 04:46:54.859087 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" path="/var/lib/kubelet/pods/f5ae1fc2-f3db-41a7-bc26-d3e71eceff52/volumes"
Dec 09 04:46:56 crc kubenswrapper[4766]: I1209 04:46:56.240107 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:56 crc kubenswrapper[4766]: I1209 04:46:56.240180 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:56 crc kubenswrapper[4766]: I1209 04:46:56.289080 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:56 crc kubenswrapper[4766]: I1209 04:46:56.305057 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:56 crc kubenswrapper[4766]: I1209 04:46:56.716595 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:56 crc kubenswrapper[4766]: I1209 04:46:56.716667 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:57 crc kubenswrapper[4766]: I1209 04:46:57.999307 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:58 crc kubenswrapper[4766]: I1209 04:46:58.000465 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:58 crc kubenswrapper[4766]: I1209 04:46:58.028399 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:58 crc kubenswrapper[4766]: I1209 04:46:58.056905 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:58 crc kubenswrapper[4766]: I1209 04:46:58.691794 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:58 crc kubenswrapper[4766]: I1209 04:46:58.691844 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 09 04:46:58 crc kubenswrapper[4766]: I1209 04:46:58.734472 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 09 04:46:58 crc kubenswrapper[4766]: I1209 04:46:58.734521 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 09 04:47:00 crc kubenswrapper[4766]: I1209 04:47:00.704779 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 09 04:47:00 crc kubenswrapper[4766]: I1209 04:47:00.751049 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 09 04:47:00 crc kubenswrapper[4766]: I1209 04:47:00.773411 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 09 04:47:04 crc kubenswrapper[4766]: I1209 04:47:04.839797 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d"
Dec 09 04:47:04 crc kubenswrapper[4766]: E1209 04:47:04.840565 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838"
Dec 09 04:47:06 crc kubenswrapper[4766]: I1209 04:47:06.969016 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5pk2z"]
Dec 09 04:47:06 crc kubenswrapper[4766]: E1209 04:47:06.969747 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" containerName="init"
Dec 09 04:47:06 crc kubenswrapper[4766]: I1209 04:47:06.969764 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" containerName="init"
Dec 09 04:47:06 crc kubenswrapper[4766]: E1209 04:47:06.969790 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" containerName="dnsmasq-dns"
Dec 09 04:47:06 crc kubenswrapper[4766]: I1209 04:47:06.969797 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" containerName="dnsmasq-dns"
Dec 09 04:47:06 crc kubenswrapper[4766]: I1209 04:47:06.970005 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ae1fc2-f3db-41a7-bc26-d3e71eceff52" containerName="dnsmasq-dns"
Dec 09 04:47:06 crc kubenswrapper[4766]: I1209 04:47:06.970726 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:06 crc kubenswrapper[4766]: I1209 04:47:06.981440 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5pk2z"]
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.058000 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5abc385-6494-4563-968a-124a07d7b1e9-operator-scripts\") pod \"placement-db-create-5pk2z\" (UID: \"a5abc385-6494-4563-968a-124a07d7b1e9\") " pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.058061 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5g7x\" (UniqueName: \"kubernetes.io/projected/a5abc385-6494-4563-968a-124a07d7b1e9-kube-api-access-d5g7x\") pod \"placement-db-create-5pk2z\" (UID: \"a5abc385-6494-4563-968a-124a07d7b1e9\") " pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.090364 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ce1d-account-create-update-5wqcp"]
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.092008 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.098425 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.110439 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ce1d-account-create-update-5wqcp"]
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.159776 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5abc385-6494-4563-968a-124a07d7b1e9-operator-scripts\") pod \"placement-db-create-5pk2z\" (UID: \"a5abc385-6494-4563-968a-124a07d7b1e9\") " pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.159823 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5g7x\" (UniqueName: \"kubernetes.io/projected/a5abc385-6494-4563-968a-124a07d7b1e9-kube-api-access-d5g7x\") pod \"placement-db-create-5pk2z\" (UID: \"a5abc385-6494-4563-968a-124a07d7b1e9\") " pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.162172 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5abc385-6494-4563-968a-124a07d7b1e9-operator-scripts\") pod \"placement-db-create-5pk2z\" (UID: \"a5abc385-6494-4563-968a-124a07d7b1e9\") " pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.179847 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5g7x\" (UniqueName: \"kubernetes.io/projected/a5abc385-6494-4563-968a-124a07d7b1e9-kube-api-access-d5g7x\") pod \"placement-db-create-5pk2z\" (UID: \"a5abc385-6494-4563-968a-124a07d7b1e9\") " pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.262031 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4710023f-0f31-45ac-b1bb-f6dd3b84704a-operator-scripts\") pod \"placement-ce1d-account-create-update-5wqcp\" (UID: \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\") " pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.262242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfc45\" (UniqueName: \"kubernetes.io/projected/4710023f-0f31-45ac-b1bb-f6dd3b84704a-kube-api-access-cfc45\") pod \"placement-ce1d-account-create-update-5wqcp\" (UID: \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\") " pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.292045 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.363832 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4710023f-0f31-45ac-b1bb-f6dd3b84704a-operator-scripts\") pod \"placement-ce1d-account-create-update-5wqcp\" (UID: \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\") " pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.364312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfc45\" (UniqueName: \"kubernetes.io/projected/4710023f-0f31-45ac-b1bb-f6dd3b84704a-kube-api-access-cfc45\") pod \"placement-ce1d-account-create-update-5wqcp\" (UID: \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\") " pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.365156 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4710023f-0f31-45ac-b1bb-f6dd3b84704a-operator-scripts\") pod \"placement-ce1d-account-create-update-5wqcp\" (UID: \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\") " pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.386039 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfc45\" (UniqueName: \"kubernetes.io/projected/4710023f-0f31-45ac-b1bb-f6dd3b84704a-kube-api-access-cfc45\") pod \"placement-ce1d-account-create-update-5wqcp\" (UID: \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\") " pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.431881 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.814387 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5pk2z"]
Dec 09 04:47:07 crc kubenswrapper[4766]: W1209 04:47:07.818205 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5abc385_6494_4563_968a_124a07d7b1e9.slice/crio-346c30c1b7206d0b982664c688da98012c882617bd85e4ba659685d2d0b3c7da WatchSource:0}: Error finding container 346c30c1b7206d0b982664c688da98012c882617bd85e4ba659685d2d0b3c7da: Status 404 returned error can't find the container with id 346c30c1b7206d0b982664c688da98012c882617bd85e4ba659685d2d0b3c7da
Dec 09 04:47:07 crc kubenswrapper[4766]: I1209 04:47:07.895877 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ce1d-account-create-update-5wqcp"]
Dec 09 04:47:07 crc kubenswrapper[4766]: W1209 04:47:07.911834 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4710023f_0f31_45ac_b1bb_f6dd3b84704a.slice/crio-63dcc711cd9bd291c739b1399374262c944143ce05d0d2644ecd2cfbff6f65cf WatchSource:0}: Error finding container 63dcc711cd9bd291c739b1399374262c944143ce05d0d2644ecd2cfbff6f65cf: Status 404 returned error can't find the container with id 63dcc711cd9bd291c739b1399374262c944143ce05d0d2644ecd2cfbff6f65cf
Dec 09 04:47:08 crc kubenswrapper[4766]: I1209 04:47:08.835407 4766 generic.go:334] "Generic (PLEG): container finished" podID="4710023f-0f31-45ac-b1bb-f6dd3b84704a" containerID="87363e8fe0febdcef5dc8f33554b29a6a4781810e413f9fe4e899fb0962bf53f" exitCode=0
Dec 09 04:47:08 crc kubenswrapper[4766]: I1209 04:47:08.835465 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ce1d-account-create-update-5wqcp" event={"ID":"4710023f-0f31-45ac-b1bb-f6dd3b84704a","Type":"ContainerDied","Data":"87363e8fe0febdcef5dc8f33554b29a6a4781810e413f9fe4e899fb0962bf53f"}
Dec 09 04:47:08 crc kubenswrapper[4766]: I1209 04:47:08.835925 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ce1d-account-create-update-5wqcp" event={"ID":"4710023f-0f31-45ac-b1bb-f6dd3b84704a","Type":"ContainerStarted","Data":"63dcc711cd9bd291c739b1399374262c944143ce05d0d2644ecd2cfbff6f65cf"}
Dec 09 04:47:08 crc kubenswrapper[4766]: I1209 04:47:08.838485 4766 generic.go:334] "Generic (PLEG): container finished" podID="a5abc385-6494-4563-968a-124a07d7b1e9" containerID="fdc245bf65be30f0eab7a2e26fcfe3455a04d5c788a2e88ef8b4a8ad9ffc20a9" exitCode=0
Dec 09 04:47:08 crc kubenswrapper[4766]: I1209 04:47:08.853001 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5pk2z" event={"ID":"a5abc385-6494-4563-968a-124a07d7b1e9","Type":"ContainerDied","Data":"fdc245bf65be30f0eab7a2e26fcfe3455a04d5c788a2e88ef8b4a8ad9ffc20a9"}
Dec 09 04:47:08 crc kubenswrapper[4766]: I1209 04:47:08.853077 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5pk2z" event={"ID":"a5abc385-6494-4563-968a-124a07d7b1e9","Type":"ContainerStarted","Data":"346c30c1b7206d0b982664c688da98012c882617bd85e4ba659685d2d0b3c7da"}
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.291803 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.297753 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.421267 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfc45\" (UniqueName: \"kubernetes.io/projected/4710023f-0f31-45ac-b1bb-f6dd3b84704a-kube-api-access-cfc45\") pod \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\" (UID: \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\") "
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.421343 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4710023f-0f31-45ac-b1bb-f6dd3b84704a-operator-scripts\") pod \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\" (UID: \"4710023f-0f31-45ac-b1bb-f6dd3b84704a\") "
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.421561 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5g7x\" (UniqueName: \"kubernetes.io/projected/a5abc385-6494-4563-968a-124a07d7b1e9-kube-api-access-d5g7x\") pod \"a5abc385-6494-4563-968a-124a07d7b1e9\" (UID: \"a5abc385-6494-4563-968a-124a07d7b1e9\") "
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.421614 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5abc385-6494-4563-968a-124a07d7b1e9-operator-scripts\") pod \"a5abc385-6494-4563-968a-124a07d7b1e9\" (UID: \"a5abc385-6494-4563-968a-124a07d7b1e9\") "
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.422404 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4710023f-0f31-45ac-b1bb-f6dd3b84704a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4710023f-0f31-45ac-b1bb-f6dd3b84704a" (UID: "4710023f-0f31-45ac-b1bb-f6dd3b84704a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.422815 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5abc385-6494-4563-968a-124a07d7b1e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5abc385-6494-4563-968a-124a07d7b1e9" (UID: "a5abc385-6494-4563-968a-124a07d7b1e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.429503 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4710023f-0f31-45ac-b1bb-f6dd3b84704a-kube-api-access-cfc45" (OuterVolumeSpecName: "kube-api-access-cfc45") pod "4710023f-0f31-45ac-b1bb-f6dd3b84704a" (UID: "4710023f-0f31-45ac-b1bb-f6dd3b84704a"). InnerVolumeSpecName "kube-api-access-cfc45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.433579 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5abc385-6494-4563-968a-124a07d7b1e9-kube-api-access-d5g7x" (OuterVolumeSpecName: "kube-api-access-d5g7x") pod "a5abc385-6494-4563-968a-124a07d7b1e9" (UID: "a5abc385-6494-4563-968a-124a07d7b1e9"). InnerVolumeSpecName "kube-api-access-d5g7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.525119 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfc45\" (UniqueName: \"kubernetes.io/projected/4710023f-0f31-45ac-b1bb-f6dd3b84704a-kube-api-access-cfc45\") on node \"crc\" DevicePath \"\""
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.525209 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4710023f-0f31-45ac-b1bb-f6dd3b84704a-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.525265 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5g7x\" (UniqueName: \"kubernetes.io/projected/a5abc385-6494-4563-968a-124a07d7b1e9-kube-api-access-d5g7x\") on node \"crc\" DevicePath \"\""
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.525288 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5abc385-6494-4563-968a-124a07d7b1e9-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.864649 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ce1d-account-create-update-5wqcp" event={"ID":"4710023f-0f31-45ac-b1bb-f6dd3b84704a","Type":"ContainerDied","Data":"63dcc711cd9bd291c739b1399374262c944143ce05d0d2644ecd2cfbff6f65cf"}
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.864727 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63dcc711cd9bd291c739b1399374262c944143ce05d0d2644ecd2cfbff6f65cf"
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.864831 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ce1d-account-create-update-5wqcp"
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.867516 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5pk2z" event={"ID":"a5abc385-6494-4563-968a-124a07d7b1e9","Type":"ContainerDied","Data":"346c30c1b7206d0b982664c688da98012c882617bd85e4ba659685d2d0b3c7da"}
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.867574 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="346c30c1b7206d0b982664c688da98012c882617bd85e4ba659685d2d0b3c7da"
Dec 09 04:47:10 crc kubenswrapper[4766]: I1209 04:47:10.867979 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5pk2z"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.317529 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-664dd56477-qp45x"]
Dec 09 04:47:12 crc kubenswrapper[4766]: E1209 04:47:12.318087 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5abc385-6494-4563-968a-124a07d7b1e9" containerName="mariadb-database-create"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.318098 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5abc385-6494-4563-968a-124a07d7b1e9" containerName="mariadb-database-create"
Dec 09 04:47:12 crc kubenswrapper[4766]: E1209 04:47:12.318128 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4710023f-0f31-45ac-b1bb-f6dd3b84704a" containerName="mariadb-account-create-update"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.318134 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4710023f-0f31-45ac-b1bb-f6dd3b84704a" containerName="mariadb-account-create-update"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.318292 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5abc385-6494-4563-968a-124a07d7b1e9" containerName="mariadb-database-create"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.318313 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4710023f-0f31-45ac-b1bb-f6dd3b84704a" containerName="mariadb-account-create-update"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.320314 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.327896 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-664dd56477-qp45x"]
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.361929 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2c8w2"]
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.364689 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.367759 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.368034 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4ztrz"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.368237 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.396550 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2c8w2"]
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.476484 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-config-data\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.476534 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-sb\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.476563 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhjf\" (UniqueName: \"kubernetes.io/projected/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-kube-api-access-vxhjf\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.476581 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-nb\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.476603 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-config\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.476631 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-scripts\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.476647 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-dns-svc\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.477159 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdvw\" (UniqueName: \"kubernetes.io/projected/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-kube-api-access-zfdvw\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.477237 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-logs\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.477309 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-combined-ca-bundle\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.578683 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-combined-ca-bundle\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.578772 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-config-data\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.578804 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-sb\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.578835 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhjf\" (UniqueName: \"kubernetes.io/projected/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-kube-api-access-vxhjf\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.578859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-nb\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.578888 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-config\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.579886 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-nb\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.579905 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-config\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.578920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-scripts\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.579994 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-dns-svc\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.580082 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdvw\" (UniqueName: \"kubernetes.io/projected/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-kube-api-access-zfdvw\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.580136 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-logs\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.580295 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-sb\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.580605 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-logs\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.580853 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-dns-svc\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.584168 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-combined-ca-bundle\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.588626 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-scripts\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.590948 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-config-data\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.602246 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdvw\" (UniqueName: \"kubernetes.io/projected/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-kube-api-access-zfdvw\") pod \"dnsmasq-dns-664dd56477-qp45x\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.605409 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhjf\" (UniqueName: \"kubernetes.io/projected/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-kube-api-access-vxhjf\") pod \"placement-db-sync-2c8w2\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " pod="openstack/placement-db-sync-2c8w2"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.639608 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664dd56477-qp45x"
Dec 09 04:47:12 crc kubenswrapper[4766]: I1209 04:47:12.695586 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2c8w2" Dec 09 04:47:13 crc kubenswrapper[4766]: I1209 04:47:13.079652 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-664dd56477-qp45x"] Dec 09 04:47:13 crc kubenswrapper[4766]: I1209 04:47:13.176341 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2c8w2"] Dec 09 04:47:13 crc kubenswrapper[4766]: W1209 04:47:13.188655 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2c888aa_7ec3_4e63_8d79_dc668fd0d5e4.slice/crio-e4eab3f90179cab6a91c0cff4059ba3e0a27653ef6155bf3ba3b4655a6bfca54 WatchSource:0}: Error finding container e4eab3f90179cab6a91c0cff4059ba3e0a27653ef6155bf3ba3b4655a6bfca54: Status 404 returned error can't find the container with id e4eab3f90179cab6a91c0cff4059ba3e0a27653ef6155bf3ba3b4655a6bfca54 Dec 09 04:47:13 crc kubenswrapper[4766]: I1209 04:47:13.905410 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2c8w2" event={"ID":"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4","Type":"ContainerStarted","Data":"ac46f9d70c6436d0e62a8804cf61fc36234c3e971c7f31a576ef218d3a0903b5"} Dec 09 04:47:13 crc kubenswrapper[4766]: I1209 04:47:13.906786 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2c8w2" event={"ID":"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4","Type":"ContainerStarted","Data":"e4eab3f90179cab6a91c0cff4059ba3e0a27653ef6155bf3ba3b4655a6bfca54"} Dec 09 04:47:13 crc kubenswrapper[4766]: I1209 04:47:13.909524 4766 generic.go:334] "Generic (PLEG): container finished" podID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" containerID="00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d" exitCode=0 Dec 09 04:47:13 crc kubenswrapper[4766]: I1209 04:47:13.909568 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664dd56477-qp45x" 
event={"ID":"60a38c6f-bc51-42ac-84f0-0ad2dda1c886","Type":"ContainerDied","Data":"00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d"} Dec 09 04:47:13 crc kubenswrapper[4766]: I1209 04:47:13.909588 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664dd56477-qp45x" event={"ID":"60a38c6f-bc51-42ac-84f0-0ad2dda1c886","Type":"ContainerStarted","Data":"0f931cecffed7e1332881c664ee59df8f65c831efbc712851ece5b42aea4904f"} Dec 09 04:47:13 crc kubenswrapper[4766]: I1209 04:47:13.935035 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2c8w2" podStartSLOduration=1.935020598 podStartE2EDuration="1.935020598s" podCreationTimestamp="2025-12-09 04:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:47:13.933397473 +0000 UTC m=+5715.642702889" watchObservedRunningTime="2025-12-09 04:47:13.935020598 +0000 UTC m=+5715.644326024" Dec 09 04:47:14 crc kubenswrapper[4766]: I1209 04:47:14.920388 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664dd56477-qp45x" event={"ID":"60a38c6f-bc51-42ac-84f0-0ad2dda1c886","Type":"ContainerStarted","Data":"cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161"} Dec 09 04:47:14 crc kubenswrapper[4766]: I1209 04:47:14.922343 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-664dd56477-qp45x" Dec 09 04:47:14 crc kubenswrapper[4766]: I1209 04:47:14.925743 4766 generic.go:334] "Generic (PLEG): container finished" podID="c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" containerID="ac46f9d70c6436d0e62a8804cf61fc36234c3e971c7f31a576ef218d3a0903b5" exitCode=0 Dec 09 04:47:14 crc kubenswrapper[4766]: I1209 04:47:14.925824 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2c8w2" 
event={"ID":"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4","Type":"ContainerDied","Data":"ac46f9d70c6436d0e62a8804cf61fc36234c3e971c7f31a576ef218d3a0903b5"} Dec 09 04:47:14 crc kubenswrapper[4766]: I1209 04:47:14.950771 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-664dd56477-qp45x" podStartSLOduration=2.950742747 podStartE2EDuration="2.950742747s" podCreationTimestamp="2025-12-09 04:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:47:14.947650744 +0000 UTC m=+5716.656956170" watchObservedRunningTime="2025-12-09 04:47:14.950742747 +0000 UTC m=+5716.660048183" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.287629 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2c8w2" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.477420 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxhjf\" (UniqueName: \"kubernetes.io/projected/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-kube-api-access-vxhjf\") pod \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.477739 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-logs\") pod \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.477772 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-config-data\") pod \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 
04:47:16.477824 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-combined-ca-bundle\") pod \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.477882 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-scripts\") pod \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\" (UID: \"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4\") " Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.478717 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-logs" (OuterVolumeSpecName: "logs") pod "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" (UID: "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.483101 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-scripts" (OuterVolumeSpecName: "scripts") pod "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" (UID: "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.483617 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-kube-api-access-vxhjf" (OuterVolumeSpecName: "kube-api-access-vxhjf") pod "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" (UID: "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4"). InnerVolumeSpecName "kube-api-access-vxhjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.503298 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" (UID: "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.505460 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-config-data" (OuterVolumeSpecName: "config-data") pod "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" (UID: "c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.580554 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxhjf\" (UniqueName: \"kubernetes.io/projected/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-kube-api-access-vxhjf\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.580611 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.580632 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.580660 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 
04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.580678 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.943407 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2c8w2" event={"ID":"c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4","Type":"ContainerDied","Data":"e4eab3f90179cab6a91c0cff4059ba3e0a27653ef6155bf3ba3b4655a6bfca54"} Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.943447 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4eab3f90179cab6a91c0cff4059ba3e0a27653ef6155bf3ba3b4655a6bfca54" Dec 09 04:47:16 crc kubenswrapper[4766]: I1209 04:47:16.943524 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2c8w2" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.036847 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bf6488d8b-5q42s"] Dec 09 04:47:17 crc kubenswrapper[4766]: E1209 04:47:17.038372 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" containerName="placement-db-sync" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.038407 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" containerName="placement-db-sync" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.038710 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" containerName="placement-db-sync" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.042520 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.044285 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.045952 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.046012 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4ztrz" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.050778 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bf6488d8b-5q42s"] Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.090804 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvtz8\" (UniqueName: \"kubernetes.io/projected/246c2111-735d-43b7-bb05-a0566897889e-kube-api-access-wvtz8\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.090909 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-config-data\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.090986 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-scripts\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.091081 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246c2111-735d-43b7-bb05-a0566897889e-logs\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.091329 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-combined-ca-bundle\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.193287 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvtz8\" (UniqueName: \"kubernetes.io/projected/246c2111-735d-43b7-bb05-a0566897889e-kube-api-access-wvtz8\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.193700 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-config-data\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.194373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-scripts\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.194423 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246c2111-735d-43b7-bb05-a0566897889e-logs\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.194493 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-combined-ca-bundle\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.195075 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246c2111-735d-43b7-bb05-a0566897889e-logs\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.198147 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-combined-ca-bundle\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.198667 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-config-data\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.205662 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246c2111-735d-43b7-bb05-a0566897889e-scripts\") pod \"placement-bf6488d8b-5q42s\" (UID: 
\"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.213492 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvtz8\" (UniqueName: \"kubernetes.io/projected/246c2111-735d-43b7-bb05-a0566897889e-kube-api-access-wvtz8\") pod \"placement-bf6488d8b-5q42s\" (UID: \"246c2111-735d-43b7-bb05-a0566897889e\") " pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.374751 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:17 crc kubenswrapper[4766]: W1209 04:47:17.791347 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod246c2111_735d_43b7_bb05_a0566897889e.slice/crio-1d35794d6520cefe743f41976018dc7f15717dadb6cc2eba262b3d02b19d197c WatchSource:0}: Error finding container 1d35794d6520cefe743f41976018dc7f15717dadb6cc2eba262b3d02b19d197c: Status 404 returned error can't find the container with id 1d35794d6520cefe743f41976018dc7f15717dadb6cc2eba262b3d02b19d197c Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.795519 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bf6488d8b-5q42s"] Dec 09 04:47:17 crc kubenswrapper[4766]: I1209 04:47:17.952051 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bf6488d8b-5q42s" event={"ID":"246c2111-735d-43b7-bb05-a0566897889e","Type":"ContainerStarted","Data":"1d35794d6520cefe743f41976018dc7f15717dadb6cc2eba262b3d02b19d197c"} Dec 09 04:47:18 crc kubenswrapper[4766]: I1209 04:47:18.846997 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:47:18 crc kubenswrapper[4766]: E1209 04:47:18.848202 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:47:18 crc kubenswrapper[4766]: I1209 04:47:18.960417 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bf6488d8b-5q42s" event={"ID":"246c2111-735d-43b7-bb05-a0566897889e","Type":"ContainerStarted","Data":"9178277c71e5b8a32b80c1af16c5b794d73541736a309615c9e30c6dc8a45c77"} Dec 09 04:47:18 crc kubenswrapper[4766]: I1209 04:47:18.960454 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bf6488d8b-5q42s" event={"ID":"246c2111-735d-43b7-bb05-a0566897889e","Type":"ContainerStarted","Data":"b25f669f88dcf1f16546955776f73879bc494cecae90ec8556bdc4fff377f515"} Dec 09 04:47:18 crc kubenswrapper[4766]: I1209 04:47:18.960536 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:18 crc kubenswrapper[4766]: I1209 04:47:18.960564 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:18 crc kubenswrapper[4766]: I1209 04:47:18.981469 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-bf6488d8b-5q42s" podStartSLOduration=1.981449097 podStartE2EDuration="1.981449097s" podCreationTimestamp="2025-12-09 04:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:47:18.976805851 +0000 UTC m=+5720.686111277" watchObservedRunningTime="2025-12-09 04:47:18.981449097 +0000 UTC m=+5720.690754523" Dec 09 04:47:22 crc kubenswrapper[4766]: I1209 04:47:22.641398 4766 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-664dd56477-qp45x" Dec 09 04:47:22 crc kubenswrapper[4766]: I1209 04:47:22.727555 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56ffb59c77-b4j2r"] Dec 09 04:47:22 crc kubenswrapper[4766]: I1209 04:47:22.728011 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" podUID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" containerName="dnsmasq-dns" containerID="cri-o://686e1f6ad644303c5903b9d33cc07198eca6f83b86deb6d9e18c67f456882670" gracePeriod=10 Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.009501 4766 generic.go:334] "Generic (PLEG): container finished" podID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" containerID="686e1f6ad644303c5903b9d33cc07198eca6f83b86deb6d9e18c67f456882670" exitCode=0 Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.009553 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" event={"ID":"b3dd023b-bcc3-4f74-b1ab-f29801cd7643","Type":"ContainerDied","Data":"686e1f6ad644303c5903b9d33cc07198eca6f83b86deb6d9e18c67f456882670"} Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.193920 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.300542 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-sb\") pod \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.300652 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-dns-svc\") pod \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.300689 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7dh4\" (UniqueName: \"kubernetes.io/projected/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-kube-api-access-d7dh4\") pod \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.300762 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-config\") pod \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.301103 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-nb\") pod \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\" (UID: \"b3dd023b-bcc3-4f74-b1ab-f29801cd7643\") " Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.317428 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-kube-api-access-d7dh4" (OuterVolumeSpecName: "kube-api-access-d7dh4") pod "b3dd023b-bcc3-4f74-b1ab-f29801cd7643" (UID: "b3dd023b-bcc3-4f74-b1ab-f29801cd7643"). InnerVolumeSpecName "kube-api-access-d7dh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.350749 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3dd023b-bcc3-4f74-b1ab-f29801cd7643" (UID: "b3dd023b-bcc3-4f74-b1ab-f29801cd7643"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.352124 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3dd023b-bcc3-4f74-b1ab-f29801cd7643" (UID: "b3dd023b-bcc3-4f74-b1ab-f29801cd7643"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.367244 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3dd023b-bcc3-4f74-b1ab-f29801cd7643" (UID: "b3dd023b-bcc3-4f74-b1ab-f29801cd7643"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.368958 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-config" (OuterVolumeSpecName: "config") pod "b3dd023b-bcc3-4f74-b1ab-f29801cd7643" (UID: "b3dd023b-bcc3-4f74-b1ab-f29801cd7643"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.402895 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7dh4\" (UniqueName: \"kubernetes.io/projected/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-kube-api-access-d7dh4\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.402932 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.402941 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.402949 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:23 crc kubenswrapper[4766]: I1209 04:47:23.402957 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3dd023b-bcc3-4f74-b1ab-f29801cd7643-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:47:24 crc kubenswrapper[4766]: I1209 04:47:24.019907 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" event={"ID":"b3dd023b-bcc3-4f74-b1ab-f29801cd7643","Type":"ContainerDied","Data":"69c07bdf56466ceade4c58603647a3b88a82eb889619b5bb2e2ca85f38483614"} Dec 09 04:47:24 crc kubenswrapper[4766]: I1209 04:47:24.019961 4766 scope.go:117] "RemoveContainer" containerID="686e1f6ad644303c5903b9d33cc07198eca6f83b86deb6d9e18c67f456882670" Dec 09 04:47:24 crc kubenswrapper[4766]: I1209 04:47:24.019998 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56ffb59c77-b4j2r" Dec 09 04:47:24 crc kubenswrapper[4766]: I1209 04:47:24.045533 4766 scope.go:117] "RemoveContainer" containerID="150fad522e4623716a6f75420d45933346130ba8caaada3df9463c8aa10cbac1" Dec 09 04:47:24 crc kubenswrapper[4766]: I1209 04:47:24.064739 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56ffb59c77-b4j2r"] Dec 09 04:47:24 crc kubenswrapper[4766]: I1209 04:47:24.071985 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56ffb59c77-b4j2r"] Dec 09 04:47:24 crc kubenswrapper[4766]: I1209 04:47:24.853837 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" path="/var/lib/kubelet/pods/b3dd023b-bcc3-4f74-b1ab-f29801cd7643/volumes" Dec 09 04:47:32 crc kubenswrapper[4766]: I1209 04:47:32.839554 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:47:32 crc kubenswrapper[4766]: E1209 04:47:32.840245 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:47:45 crc kubenswrapper[4766]: I1209 04:47:45.839406 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:47:45 crc kubenswrapper[4766]: E1209 04:47:45.840441 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:47:48 crc kubenswrapper[4766]: I1209 04:47:48.366118 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:48 crc kubenswrapper[4766]: I1209 04:47:48.378175 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-bf6488d8b-5q42s" Dec 09 04:47:57 crc kubenswrapper[4766]: I1209 04:47:57.840127 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:47:57 crc kubenswrapper[4766]: E1209 04:47:57.841237 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.244574 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-n4llq"] Dec 09 04:48:10 crc kubenswrapper[4766]: E1209 04:48:10.245607 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" containerName="init" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.245626 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" containerName="init" Dec 09 04:48:10 crc kubenswrapper[4766]: E1209 04:48:10.245662 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" containerName="dnsmasq-dns" Dec 09 04:48:10 
crc kubenswrapper[4766]: I1209 04:48:10.245670 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" containerName="dnsmasq-dns" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.245910 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3dd023b-bcc3-4f74-b1ab-f29801cd7643" containerName="dnsmasq-dns" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.246686 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n4llq" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.258763 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-n4llq"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.339025 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qnm8w"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.340886 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qnm8w" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.357302 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qnm8w"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.413123 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckrpr\" (UniqueName: \"kubernetes.io/projected/6005b1e9-bb74-426c-ae88-061a16303fdf-kube-api-access-ckrpr\") pod \"nova-api-db-create-n4llq\" (UID: \"6005b1e9-bb74-426c-ae88-061a16303fdf\") " pod="openstack/nova-api-db-create-n4llq" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.413172 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6005b1e9-bb74-426c-ae88-061a16303fdf-operator-scripts\") pod \"nova-api-db-create-n4llq\" (UID: \"6005b1e9-bb74-426c-ae88-061a16303fdf\") " 
pod="openstack/nova-api-db-create-n4llq" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.413203 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klbg6\" (UniqueName: \"kubernetes.io/projected/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-kube-api-access-klbg6\") pod \"nova-cell0-db-create-qnm8w\" (UID: \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\") " pod="openstack/nova-cell0-db-create-qnm8w" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.413293 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-operator-scripts\") pod \"nova-cell0-db-create-qnm8w\" (UID: \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\") " pod="openstack/nova-cell0-db-create-qnm8w" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.460938 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b4cc-account-create-update-c7wqv"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.462443 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b4cc-account-create-update-c7wqv" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.467767 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b4cc-account-create-update-c7wqv"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.468187 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.515132 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckrpr\" (UniqueName: \"kubernetes.io/projected/6005b1e9-bb74-426c-ae88-061a16303fdf-kube-api-access-ckrpr\") pod \"nova-api-db-create-n4llq\" (UID: \"6005b1e9-bb74-426c-ae88-061a16303fdf\") " pod="openstack/nova-api-db-create-n4llq" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.515187 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6005b1e9-bb74-426c-ae88-061a16303fdf-operator-scripts\") pod \"nova-api-db-create-n4llq\" (UID: \"6005b1e9-bb74-426c-ae88-061a16303fdf\") " pod="openstack/nova-api-db-create-n4llq" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.515294 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klbg6\" (UniqueName: \"kubernetes.io/projected/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-kube-api-access-klbg6\") pod \"nova-cell0-db-create-qnm8w\" (UID: \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\") " pod="openstack/nova-cell0-db-create-qnm8w" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.515325 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-operator-scripts\") pod \"nova-cell0-db-create-qnm8w\" (UID: \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\") " pod="openstack/nova-cell0-db-create-qnm8w" Dec 09 
04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.516715 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-operator-scripts\") pod \"nova-cell0-db-create-qnm8w\" (UID: \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\") " pod="openstack/nova-cell0-db-create-qnm8w" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.516791 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6005b1e9-bb74-426c-ae88-061a16303fdf-operator-scripts\") pod \"nova-api-db-create-n4llq\" (UID: \"6005b1e9-bb74-426c-ae88-061a16303fdf\") " pod="openstack/nova-api-db-create-n4llq" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.534614 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckrpr\" (UniqueName: \"kubernetes.io/projected/6005b1e9-bb74-426c-ae88-061a16303fdf-kube-api-access-ckrpr\") pod \"nova-api-db-create-n4llq\" (UID: \"6005b1e9-bb74-426c-ae88-061a16303fdf\") " pod="openstack/nova-api-db-create-n4llq" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.536490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klbg6\" (UniqueName: \"kubernetes.io/projected/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-kube-api-access-klbg6\") pod \"nova-cell0-db-create-qnm8w\" (UID: \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\") " pod="openstack/nova-cell0-db-create-qnm8w" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.558596 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9bpvj"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.559988 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9bpvj" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.566638 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9bpvj"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.569648 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n4llq" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.616803 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-operator-scripts\") pod \"nova-api-b4cc-account-create-update-c7wqv\" (UID: \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\") " pod="openstack/nova-api-b4cc-account-create-update-c7wqv" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.616860 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn2dn\" (UniqueName: \"kubernetes.io/projected/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-kube-api-access-fn2dn\") pod \"nova-api-b4cc-account-create-update-c7wqv\" (UID: \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\") " pod="openstack/nova-api-b4cc-account-create-update-c7wqv" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.662357 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d0a7-account-create-update-zgx2j"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.663715 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.664303 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qnm8w" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.665507 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.682935 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d0a7-account-create-update-zgx2j"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.719827 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx5k2\" (UniqueName: \"kubernetes.io/projected/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-kube-api-access-vx5k2\") pod \"nova-cell1-db-create-9bpvj\" (UID: \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\") " pod="openstack/nova-cell1-db-create-9bpvj" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.720660 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-operator-scripts\") pod \"nova-api-b4cc-account-create-update-c7wqv\" (UID: \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\") " pod="openstack/nova-api-b4cc-account-create-update-c7wqv" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.720747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn2dn\" (UniqueName: \"kubernetes.io/projected/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-kube-api-access-fn2dn\") pod \"nova-api-b4cc-account-create-update-c7wqv\" (UID: \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\") " pod="openstack/nova-api-b4cc-account-create-update-c7wqv" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.720861 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-operator-scripts\") pod \"nova-cell1-db-create-9bpvj\" (UID: 
\"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\") " pod="openstack/nova-cell1-db-create-9bpvj" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.722946 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-operator-scripts\") pod \"nova-api-b4cc-account-create-update-c7wqv\" (UID: \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\") " pod="openstack/nova-api-b4cc-account-create-update-c7wqv" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.750710 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn2dn\" (UniqueName: \"kubernetes.io/projected/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-kube-api-access-fn2dn\") pod \"nova-api-b4cc-account-create-update-c7wqv\" (UID: \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\") " pod="openstack/nova-api-b4cc-account-create-update-c7wqv" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.794757 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b4cc-account-create-update-c7wqv" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.827030 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3d564da-9292-4470-98df-262313eab805-operator-scripts\") pod \"nova-cell0-d0a7-account-create-update-zgx2j\" (UID: \"c3d564da-9292-4470-98df-262313eab805\") " pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.827093 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-operator-scripts\") pod \"nova-cell1-db-create-9bpvj\" (UID: \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\") " pod="openstack/nova-cell1-db-create-9bpvj" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.827350 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx5k2\" (UniqueName: \"kubernetes.io/projected/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-kube-api-access-vx5k2\") pod \"nova-cell1-db-create-9bpvj\" (UID: \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\") " pod="openstack/nova-cell1-db-create-9bpvj" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.827402 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-287xw\" (UniqueName: \"kubernetes.io/projected/c3d564da-9292-4470-98df-262313eab805-kube-api-access-287xw\") pod \"nova-cell0-d0a7-account-create-update-zgx2j\" (UID: \"c3d564da-9292-4470-98df-262313eab805\") " pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.828900 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-operator-scripts\") pod \"nova-cell1-db-create-9bpvj\" (UID: \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\") " pod="openstack/nova-cell1-db-create-9bpvj" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.851740 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx5k2\" (UniqueName: \"kubernetes.io/projected/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-kube-api-access-vx5k2\") pod \"nova-cell1-db-create-9bpvj\" (UID: \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\") " pod="openstack/nova-cell1-db-create-9bpvj" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.861668 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-bc6f-account-create-update-5v5bc"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.863325 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.867661 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.870400 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bc6f-account-create-update-5v5bc"] Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.928624 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-287xw\" (UniqueName: \"kubernetes.io/projected/c3d564da-9292-4470-98df-262313eab805-kube-api-access-287xw\") pod \"nova-cell0-d0a7-account-create-update-zgx2j\" (UID: \"c3d564da-9292-4470-98df-262313eab805\") " pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.928706 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c3d564da-9292-4470-98df-262313eab805-operator-scripts\") pod \"nova-cell0-d0a7-account-create-update-zgx2j\" (UID: \"c3d564da-9292-4470-98df-262313eab805\") " pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.929327 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3d564da-9292-4470-98df-262313eab805-operator-scripts\") pod \"nova-cell0-d0a7-account-create-update-zgx2j\" (UID: \"c3d564da-9292-4470-98df-262313eab805\") " pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" Dec 09 04:48:10 crc kubenswrapper[4766]: I1209 04:48:10.948977 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-287xw\" (UniqueName: \"kubernetes.io/projected/c3d564da-9292-4470-98df-262313eab805-kube-api-access-287xw\") pod \"nova-cell0-d0a7-account-create-update-zgx2j\" (UID: \"c3d564da-9292-4470-98df-262313eab805\") " pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.030104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-operator-scripts\") pod \"nova-cell1-bc6f-account-create-update-5v5bc\" (UID: \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\") " pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.030297 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnznk\" (UniqueName: \"kubernetes.io/projected/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-kube-api-access-hnznk\") pod \"nova-cell1-bc6f-account-create-update-5v5bc\" (UID: \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\") " pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" Dec 09 04:48:11 crc 
kubenswrapper[4766]: I1209 04:48:11.045888 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9bpvj" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.059744 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.067967 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-n4llq"] Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.131928 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-operator-scripts\") pod \"nova-cell1-bc6f-account-create-update-5v5bc\" (UID: \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\") " pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.132282 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnznk\" (UniqueName: \"kubernetes.io/projected/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-kube-api-access-hnznk\") pod \"nova-cell1-bc6f-account-create-update-5v5bc\" (UID: \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\") " pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.132961 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-operator-scripts\") pod \"nova-cell1-bc6f-account-create-update-5v5bc\" (UID: \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\") " pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.149836 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnznk\" (UniqueName: 
\"kubernetes.io/projected/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-kube-api-access-hnznk\") pod \"nova-cell1-bc6f-account-create-update-5v5bc\" (UID: \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\") " pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.186877 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.227506 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qnm8w"] Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.322748 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b4cc-account-create-update-c7wqv"] Dec 09 04:48:11 crc kubenswrapper[4766]: W1209 04:48:11.341720 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b0b6d0d_1057_441d_9ed1_ec95aa9f73e6.slice/crio-2126c782c704f3d313f6436cfae3e5afc6b4ac02bed26f860e54fa4de84e9004 WatchSource:0}: Error finding container 2126c782c704f3d313f6436cfae3e5afc6b4ac02bed26f860e54fa4de84e9004: Status 404 returned error can't find the container with id 2126c782c704f3d313f6436cfae3e5afc6b4ac02bed26f860e54fa4de84e9004 Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.461363 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qnm8w" event={"ID":"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d","Type":"ContainerStarted","Data":"e5ba430ea6d7bf02f687174100578640b5e7dea967aadb2ff404b2ac4f3e6ae5"} Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.461412 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qnm8w" event={"ID":"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d","Type":"ContainerStarted","Data":"825d438da27747faf9aecef7c79349e777346b045811fb8ae563c05505dbc016"} Dec 09 04:48:11 crc kubenswrapper[4766]: 
I1209 04:48:11.466554 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b4cc-account-create-update-c7wqv" event={"ID":"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6","Type":"ContainerStarted","Data":"2126c782c704f3d313f6436cfae3e5afc6b4ac02bed26f860e54fa4de84e9004"} Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.468485 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n4llq" event={"ID":"6005b1e9-bb74-426c-ae88-061a16303fdf","Type":"ContainerStarted","Data":"e9064e2f4b501eeb5dd16aeb6769838465b8cce0ba57d339a919f19b7ea616a7"} Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.468520 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n4llq" event={"ID":"6005b1e9-bb74-426c-ae88-061a16303fdf","Type":"ContainerStarted","Data":"aa18ead358ac0d096cf9e6a4c1c4691f9d784c5a226ec4652158f1902b23aae4"} Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.489451 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-qnm8w" podStartSLOduration=1.489431685 podStartE2EDuration="1.489431685s" podCreationTimestamp="2025-12-09 04:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:11.482878617 +0000 UTC m=+5773.192184043" watchObservedRunningTime="2025-12-09 04:48:11.489431685 +0000 UTC m=+5773.198737111" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.500097 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-n4llq" podStartSLOduration=1.500074772 podStartE2EDuration="1.500074772s" podCreationTimestamp="2025-12-09 04:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:11.49743239 +0000 UTC m=+5773.206737826" watchObservedRunningTime="2025-12-09 
04:48:11.500074772 +0000 UTC m=+5773.209380198" Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.546249 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d0a7-account-create-update-zgx2j"] Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.619258 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9bpvj"] Dec 09 04:48:11 crc kubenswrapper[4766]: W1209 04:48:11.627115 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0da2c5d7_0a81_43ac_b5bf_dbab2ec572ed.slice/crio-6bdb63e956c7c5f8c610e0d7d0baf95651b4f37590fd55b3c98948c16c7b8c6f WatchSource:0}: Error finding container 6bdb63e956c7c5f8c610e0d7d0baf95651b4f37590fd55b3c98948c16c7b8c6f: Status 404 returned error can't find the container with id 6bdb63e956c7c5f8c610e0d7d0baf95651b4f37590fd55b3c98948c16c7b8c6f Dec 09 04:48:11 crc kubenswrapper[4766]: I1209 04:48:11.730657 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bc6f-account-create-update-5v5bc"] Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.499013 4766 generic.go:334] "Generic (PLEG): container finished" podID="6005b1e9-bb74-426c-ae88-061a16303fdf" containerID="e9064e2f4b501eeb5dd16aeb6769838465b8cce0ba57d339a919f19b7ea616a7" exitCode=0 Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.499266 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n4llq" event={"ID":"6005b1e9-bb74-426c-ae88-061a16303fdf","Type":"ContainerDied","Data":"e9064e2f4b501eeb5dd16aeb6769838465b8cce0ba57d339a919f19b7ea616a7"} Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.506656 4766 generic.go:334] "Generic (PLEG): container finished" podID="3b843c5e-e828-45a8-ad5c-d45c63c9d4dd" containerID="4976aed5bee782c4796147d30bd20323b7b67bf1b0b3b8a4803d1732f49e2c71" exitCode=0 Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.507111 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" event={"ID":"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd","Type":"ContainerDied","Data":"4976aed5bee782c4796147d30bd20323b7b67bf1b0b3b8a4803d1732f49e2c71"}
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.507157 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" event={"ID":"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd","Type":"ContainerStarted","Data":"98f0f680a2135f6dfd7cbb4f8a93d660bd10a1032c858b8f2282b343fb6ec39b"}
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.512911 4766 generic.go:334] "Generic (PLEG): container finished" podID="0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed" containerID="ea3089b388f93b90c4febab01ccdef0444771b341f536b59f798f9039a8f7bea" exitCode=0
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.513021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9bpvj" event={"ID":"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed","Type":"ContainerDied","Data":"ea3089b388f93b90c4febab01ccdef0444771b341f536b59f798f9039a8f7bea"}
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.513101 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9bpvj" event={"ID":"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed","Type":"ContainerStarted","Data":"6bdb63e956c7c5f8c610e0d7d0baf95651b4f37590fd55b3c98948c16c7b8c6f"}
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.515068 4766 generic.go:334] "Generic (PLEG): container finished" podID="0daf6d0a-9850-471a-aaaa-dfe3ca829a7d" containerID="e5ba430ea6d7bf02f687174100578640b5e7dea967aadb2ff404b2ac4f3e6ae5" exitCode=0
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.515159 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qnm8w" event={"ID":"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d","Type":"ContainerDied","Data":"e5ba430ea6d7bf02f687174100578640b5e7dea967aadb2ff404b2ac4f3e6ae5"}
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.525896 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6" containerID="93b20b732e67e1f721660a541780a2f23abee214b8a04d7d5617c502663c1003" exitCode=0
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.525984 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b4cc-account-create-update-c7wqv" event={"ID":"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6","Type":"ContainerDied","Data":"93b20b732e67e1f721660a541780a2f23abee214b8a04d7d5617c502663c1003"}
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.529091 4766 generic.go:334] "Generic (PLEG): container finished" podID="c3d564da-9292-4470-98df-262313eab805" containerID="a070f7d80b9e18f2f8b372c2fdde9e51753dfc93f34db06b5b7f6ba29168d39e" exitCode=0
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.529150 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" event={"ID":"c3d564da-9292-4470-98df-262313eab805","Type":"ContainerDied","Data":"a070f7d80b9e18f2f8b372c2fdde9e51753dfc93f34db06b5b7f6ba29168d39e"}
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.529181 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" event={"ID":"c3d564da-9292-4470-98df-262313eab805","Type":"ContainerStarted","Data":"4bce4ddfa3e40c4550a4cbb077fc766fc25239a63bfd7f6c9947171c5c395969"}
Dec 09 04:48:12 crc kubenswrapper[4766]: I1209 04:48:12.839690 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d"
Dec 09 04:48:12 crc kubenswrapper[4766]: E1209 04:48:12.840028 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.077604 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.193299 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-operator-scripts\") pod \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\" (UID: \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.193338 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnznk\" (UniqueName: \"kubernetes.io/projected/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-kube-api-access-hnznk\") pod \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\" (UID: \"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.194910 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b843c5e-e828-45a8-ad5c-d45c63c9d4dd" (UID: "3b843c5e-e828-45a8-ad5c-d45c63c9d4dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.200554 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-kube-api-access-hnznk" (OuterVolumeSpecName: "kube-api-access-hnznk") pod "3b843c5e-e828-45a8-ad5c-d45c63c9d4dd" (UID: "3b843c5e-e828-45a8-ad5c-d45c63c9d4dd"). InnerVolumeSpecName "kube-api-access-hnznk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.295899 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.295959 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnznk\" (UniqueName: \"kubernetes.io/projected/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd-kube-api-access-hnznk\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.339689 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qnm8w"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.344942 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b4cc-account-create-update-c7wqv"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.355975 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9bpvj"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.369831 4766 scope.go:117] "RemoveContainer" containerID="46e4b938be7b973d63f3a22a6fe4b8fb9c73070336a0c4297b06ebcc79f343d9"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.370119 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.381182 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n4llq"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397025 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-operator-scripts\") pod \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\" (UID: \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397109 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-operator-scripts\") pod \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\" (UID: \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397278 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klbg6\" (UniqueName: \"kubernetes.io/projected/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-kube-api-access-klbg6\") pod \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\" (UID: \"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397330 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx5k2\" (UniqueName: \"kubernetes.io/projected/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-kube-api-access-vx5k2\") pod \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\" (UID: \"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397363 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn2dn\" (UniqueName: \"kubernetes.io/projected/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-kube-api-access-fn2dn\") pod \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\" (UID: \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397429 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-operator-scripts\") pod \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\" (UID: \"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397599 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed" (UID: "0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397645 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0daf6d0a-9850-471a-aaaa-dfe3ca829a7d" (UID: "0daf6d0a-9850-471a-aaaa-dfe3ca829a7d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397856 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.397879 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.398370 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6" (UID: "1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.403624 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-kube-api-access-klbg6" (OuterVolumeSpecName: "kube-api-access-klbg6") pod "0daf6d0a-9850-471a-aaaa-dfe3ca829a7d" (UID: "0daf6d0a-9850-471a-aaaa-dfe3ca829a7d"). InnerVolumeSpecName "kube-api-access-klbg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.404452 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-kube-api-access-fn2dn" (OuterVolumeSpecName: "kube-api-access-fn2dn") pod "1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6" (UID: "1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6"). InnerVolumeSpecName "kube-api-access-fn2dn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.405826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-kube-api-access-vx5k2" (OuterVolumeSpecName: "kube-api-access-vx5k2") pod "0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed" (UID: "0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed"). InnerVolumeSpecName "kube-api-access-vx5k2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.498972 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-287xw\" (UniqueName: \"kubernetes.io/projected/c3d564da-9292-4470-98df-262313eab805-kube-api-access-287xw\") pod \"c3d564da-9292-4470-98df-262313eab805\" (UID: \"c3d564da-9292-4470-98df-262313eab805\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.499031 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3d564da-9292-4470-98df-262313eab805-operator-scripts\") pod \"c3d564da-9292-4470-98df-262313eab805\" (UID: \"c3d564da-9292-4470-98df-262313eab805\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.499051 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckrpr\" (UniqueName: \"kubernetes.io/projected/6005b1e9-bb74-426c-ae88-061a16303fdf-kube-api-access-ckrpr\") pod \"6005b1e9-bb74-426c-ae88-061a16303fdf\" (UID: \"6005b1e9-bb74-426c-ae88-061a16303fdf\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.499185 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6005b1e9-bb74-426c-ae88-061a16303fdf-operator-scripts\") pod \"6005b1e9-bb74-426c-ae88-061a16303fdf\" (UID: \"6005b1e9-bb74-426c-ae88-061a16303fdf\") "
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.499593 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.499606 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klbg6\" (UniqueName: \"kubernetes.io/projected/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d-kube-api-access-klbg6\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.499616 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx5k2\" (UniqueName: \"kubernetes.io/projected/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed-kube-api-access-vx5k2\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.499625 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn2dn\" (UniqueName: \"kubernetes.io/projected/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6-kube-api-access-fn2dn\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.499802 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3d564da-9292-4470-98df-262313eab805-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3d564da-9292-4470-98df-262313eab805" (UID: "c3d564da-9292-4470-98df-262313eab805"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.500021 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6005b1e9-bb74-426c-ae88-061a16303fdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6005b1e9-bb74-426c-ae88-061a16303fdf" (UID: "6005b1e9-bb74-426c-ae88-061a16303fdf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.502350 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d564da-9292-4470-98df-262313eab805-kube-api-access-287xw" (OuterVolumeSpecName: "kube-api-access-287xw") pod "c3d564da-9292-4470-98df-262313eab805" (UID: "c3d564da-9292-4470-98df-262313eab805"). InnerVolumeSpecName "kube-api-access-287xw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.502606 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6005b1e9-bb74-426c-ae88-061a16303fdf-kube-api-access-ckrpr" (OuterVolumeSpecName: "kube-api-access-ckrpr") pod "6005b1e9-bb74-426c-ae88-061a16303fdf" (UID: "6005b1e9-bb74-426c-ae88-061a16303fdf"). InnerVolumeSpecName "kube-api-access-ckrpr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.550672 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j" event={"ID":"c3d564da-9292-4470-98df-262313eab805","Type":"ContainerDied","Data":"4bce4ddfa3e40c4550a4cbb077fc766fc25239a63bfd7f6c9947171c5c395969"}
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.550703 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d0a7-account-create-update-zgx2j"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.550712 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bce4ddfa3e40c4550a4cbb077fc766fc25239a63bfd7f6c9947171c5c395969"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.552884 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n4llq" event={"ID":"6005b1e9-bb74-426c-ae88-061a16303fdf","Type":"ContainerDied","Data":"aa18ead358ac0d096cf9e6a4c1c4691f9d784c5a226ec4652158f1902b23aae4"}
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.552984 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa18ead358ac0d096cf9e6a4c1c4691f9d784c5a226ec4652158f1902b23aae4"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.553110 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n4llq"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.561520 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.561757 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc6f-account-create-update-5v5bc" event={"ID":"3b843c5e-e828-45a8-ad5c-d45c63c9d4dd","Type":"ContainerDied","Data":"98f0f680a2135f6dfd7cbb4f8a93d660bd10a1032c858b8f2282b343fb6ec39b"}
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.561918 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98f0f680a2135f6dfd7cbb4f8a93d660bd10a1032c858b8f2282b343fb6ec39b"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.563983 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9bpvj" event={"ID":"0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed","Type":"ContainerDied","Data":"6bdb63e956c7c5f8c610e0d7d0baf95651b4f37590fd55b3c98948c16c7b8c6f"}
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.564051 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bdb63e956c7c5f8c610e0d7d0baf95651b4f37590fd55b3c98948c16c7b8c6f"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.564153 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9bpvj"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.566177 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qnm8w" event={"ID":"0daf6d0a-9850-471a-aaaa-dfe3ca829a7d","Type":"ContainerDied","Data":"825d438da27747faf9aecef7c79349e777346b045811fb8ae563c05505dbc016"}
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.566231 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="825d438da27747faf9aecef7c79349e777346b045811fb8ae563c05505dbc016"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.566238 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qnm8w"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.568079 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b4cc-account-create-update-c7wqv" event={"ID":"1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6","Type":"ContainerDied","Data":"2126c782c704f3d313f6436cfae3e5afc6b4ac02bed26f860e54fa4de84e9004"}
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.568112 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b4cc-account-create-update-c7wqv"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.568221 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2126c782c704f3d313f6436cfae3e5afc6b4ac02bed26f860e54fa4de84e9004"
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.601283 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-287xw\" (UniqueName: \"kubernetes.io/projected/c3d564da-9292-4470-98df-262313eab805-kube-api-access-287xw\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.601312 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3d564da-9292-4470-98df-262313eab805-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.601321 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckrpr\" (UniqueName: \"kubernetes.io/projected/6005b1e9-bb74-426c-ae88-061a16303fdf-kube-api-access-ckrpr\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:14 crc kubenswrapper[4766]: I1209 04:48:14.601330 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6005b1e9-bb74-426c-ae88-061a16303fdf-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.887402 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82hr8"]
Dec 09 04:48:15 crc kubenswrapper[4766]: E1209 04:48:15.887839 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6005b1e9-bb74-426c-ae88-061a16303fdf" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.887856 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6005b1e9-bb74-426c-ae88-061a16303fdf" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: E1209 04:48:15.887870 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.887877 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: E1209 04:48:15.887892 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0daf6d0a-9850-471a-aaaa-dfe3ca829a7d" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.887900 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0daf6d0a-9850-471a-aaaa-dfe3ca829a7d" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: E1209 04:48:15.887929 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b843c5e-e828-45a8-ad5c-d45c63c9d4dd" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.887936 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b843c5e-e828-45a8-ad5c-d45c63c9d4dd" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: E1209 04:48:15.887954 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.887960 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: E1209 04:48:15.887973 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d564da-9292-4470-98df-262313eab805" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.887980 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d564da-9292-4470-98df-262313eab805" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.888161 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.888179 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0daf6d0a-9850-471a-aaaa-dfe3ca829a7d" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.888201 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d564da-9292-4470-98df-262313eab805" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.888309 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b843c5e-e828-45a8-ad5c-d45c63c9d4dd" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.888334 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6" containerName="mariadb-account-create-update"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.888358 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6005b1e9-bb74-426c-ae88-061a16303fdf" containerName="mariadb-database-create"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.889118 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.895705 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.896295 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wksg2"
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.896612 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82hr8"]
Dec 09 04:48:15 crc kubenswrapper[4766]: I1209 04:48:15.897072 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.028416 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjgc9\" (UniqueName: \"kubernetes.io/projected/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-kube-api-access-kjgc9\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.029353 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-config-data\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.029625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.029911 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-scripts\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.132373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-config-data\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.132451 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.132510 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-scripts\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.132585 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjgc9\" (UniqueName: \"kubernetes.io/projected/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-kube-api-access-kjgc9\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.140534 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.140795 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-config-data\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.142000 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-scripts\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.163266 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjgc9\" (UniqueName: \"kubernetes.io/projected/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-kube-api-access-kjgc9\") pod \"nova-cell0-conductor-db-sync-82hr8\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") " pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.214514 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:16 crc kubenswrapper[4766]: I1209 04:48:16.687384 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82hr8"]
Dec 09 04:48:17 crc kubenswrapper[4766]: I1209 04:48:17.598682 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82hr8" event={"ID":"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d","Type":"ContainerStarted","Data":"e5441e9b89d5e07ae1c7cf467200cbf853701173ca108861f591f5d700a5c499"}
Dec 09 04:48:17 crc kubenswrapper[4766]: I1209 04:48:17.599039 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82hr8" event={"ID":"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d","Type":"ContainerStarted","Data":"38ec755e4f0298dee48c181c98def8b289da2270d290dfb295938eb2eb4efd6a"}
Dec 09 04:48:17 crc kubenswrapper[4766]: I1209 04:48:17.615263 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-82hr8" podStartSLOduration=2.615243126 podStartE2EDuration="2.615243126s" podCreationTimestamp="2025-12-09 04:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:17.613397556 +0000 UTC m=+5779.322702982" watchObservedRunningTime="2025-12-09 04:48:17.615243126 +0000 UTC m=+5779.324548552"
Dec 09 04:48:21 crc kubenswrapper[4766]: I1209 04:48:21.641961 4766 generic.go:334] "Generic (PLEG): container finished" podID="2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" containerID="e5441e9b89d5e07ae1c7cf467200cbf853701173ca108861f591f5d700a5c499" exitCode=0
Dec 09 04:48:21 crc kubenswrapper[4766]: I1209 04:48:21.642094 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82hr8" event={"ID":"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d","Type":"ContainerDied","Data":"e5441e9b89d5e07ae1c7cf467200cbf853701173ca108861f591f5d700a5c499"}
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.053403 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82hr8"
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.157550 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjgc9\" (UniqueName: \"kubernetes.io/projected/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-kube-api-access-kjgc9\") pod \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") "
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.157641 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-scripts\") pod \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") "
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.157692 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-combined-ca-bundle\") pod \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") "
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.157822 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-config-data\") pod \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\" (UID: \"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d\") "
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.163346 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-scripts" (OuterVolumeSpecName: "scripts") pod "2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" (UID: "2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.163526 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-kube-api-access-kjgc9" (OuterVolumeSpecName: "kube-api-access-kjgc9") pod "2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" (UID: "2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d"). InnerVolumeSpecName "kube-api-access-kjgc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.188634 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" (UID: "2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.189461 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-config-data" (OuterVolumeSpecName: "config-data") pod "2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" (UID: "2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.260949 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-config-data\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.261005 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjgc9\" (UniqueName: \"kubernetes.io/projected/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-kube-api-access-kjgc9\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.261030 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.261048 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.668656 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82hr8" event={"ID":"2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d","Type":"ContainerDied","Data":"38ec755e4f0298dee48c181c98def8b289da2270d290dfb295938eb2eb4efd6a"}
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.668977 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38ec755e4f0298dee48c181c98def8b289da2270d290dfb295938eb2eb4efd6a"
Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.668729 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82hr8" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.742518 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:48:23 crc kubenswrapper[4766]: E1209 04:48:23.742879 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" containerName="nova-cell0-conductor-db-sync" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.742899 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" containerName="nova-cell0-conductor-db-sync" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.743104 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" containerName="nova-cell0-conductor-db-sync" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.743745 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.748369 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.748647 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wksg2" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.754459 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.839818 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:48:23 crc kubenswrapper[4766]: E1209 04:48:23.840102 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.876981 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.877073 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.877160 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d5wz\" (UniqueName: \"kubernetes.io/projected/463cd510-5202-403e-8f31-0e0fe1d54659-kube-api-access-8d5wz\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.978683 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d5wz\" (UniqueName: \"kubernetes.io/projected/463cd510-5202-403e-8f31-0e0fe1d54659-kube-api-access-8d5wz\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.978866 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.978897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.983741 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.984079 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:23 crc kubenswrapper[4766]: I1209 04:48:23.996764 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d5wz\" (UniqueName: \"kubernetes.io/projected/463cd510-5202-403e-8f31-0e0fe1d54659-kube-api-access-8d5wz\") pod \"nova-cell0-conductor-0\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:24 crc kubenswrapper[4766]: I1209 04:48:24.062357 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:24 crc kubenswrapper[4766]: I1209 04:48:24.545942 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:48:24 crc kubenswrapper[4766]: I1209 04:48:24.678447 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"463cd510-5202-403e-8f31-0e0fe1d54659","Type":"ContainerStarted","Data":"33bd41231f3d467007d01279abe1e2c1ff89134bc6745fc34243e2084cc5d46f"} Dec 09 04:48:25 crc kubenswrapper[4766]: I1209 04:48:25.686971 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"463cd510-5202-403e-8f31-0e0fe1d54659","Type":"ContainerStarted","Data":"2803d5877db7f73f5e6c9a48d168bb45b5e63292de040cd6a10870bbc31820a5"} Dec 09 04:48:25 crc kubenswrapper[4766]: I1209 04:48:25.687356 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:25 crc kubenswrapper[4766]: I1209 04:48:25.706929 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.7069107839999997 podStartE2EDuration="2.706910784s" podCreationTimestamp="2025-12-09 04:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:25.705535217 +0000 UTC m=+5787.414840643" watchObservedRunningTime="2025-12-09 04:48:25.706910784 +0000 UTC m=+5787.416216220" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.103053 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.539550 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-kc2gn"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.540957 4766 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.543262 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.550496 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.553989 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kc2gn"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.688717 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkn7r\" (UniqueName: \"kubernetes.io/projected/98381e1d-1d58-4913-b9da-e5fd14a6eed9-kube-api-access-hkn7r\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.688985 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.689052 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-scripts\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.689184 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-config-data\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.716258 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.717629 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.720129 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.726820 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.738119 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.740071 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.744858 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.771614 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.792114 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fdwb\" (UniqueName: \"kubernetes.io/projected/1b0eac65-3749-47de-baed-8af5751821d4-kube-api-access-4fdwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.792186 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-config-data\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.792234 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkn7r\" (UniqueName: \"kubernetes.io/projected/98381e1d-1d58-4913-b9da-e5fd14a6eed9-kube-api-access-hkn7r\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.792251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: 
I1209 04:48:29.792286 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.792323 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.792345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-scripts\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.800066 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-scripts\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.800175 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.814643 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-config-data\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.821612 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkn7r\" (UniqueName: \"kubernetes.io/projected/98381e1d-1d58-4913-b9da-e5fd14a6eed9-kube-api-access-hkn7r\") pod \"nova-cell0-cell-mapping-kc2gn\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.834649 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.837916 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.843051 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.843818 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.877162 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907398 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx45n\" (UniqueName: \"kubernetes.io/projected/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-kube-api-access-zx45n\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907467 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fdwb\" (UniqueName: \"kubernetes.io/projected/1b0eac65-3749-47de-baed-8af5751821d4-kube-api-access-4fdwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907520 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-config-data\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907538 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-config-data\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:29 crc 
kubenswrapper[4766]: I1209 04:48:29.907587 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907609 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4d5j\" (UniqueName: \"kubernetes.io/projected/add73ffa-6840-4db3-9286-8b647bbd7cfa-kube-api-access-q4d5j\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907626 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907645 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/add73ffa-6840-4db3-9286-8b647bbd7cfa-logs\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.907669 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.916014 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.919221 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.931789 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fdwb\" (UniqueName: \"kubernetes.io/projected/1b0eac65-3749-47de-baed-8af5751821d4-kube-api-access-4fdwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.946447 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.947844 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.953513 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 04:48:29 crc kubenswrapper[4766]: I1209 04:48:29.978590 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.008775 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4d5j\" (UniqueName: \"kubernetes.io/projected/add73ffa-6840-4db3-9286-8b647bbd7cfa-kube-api-access-q4d5j\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.008817 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.008840 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/add73ffa-6840-4db3-9286-8b647bbd7cfa-logs\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.008927 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx45n\" (UniqueName: \"kubernetes.io/projected/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-kube-api-access-zx45n\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.008991 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-config-data\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.009008 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-config-data\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.009030 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.010730 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/add73ffa-6840-4db3-9286-8b647bbd7cfa-logs\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.016511 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-config-data\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.021125 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.021301 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.026277 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-config-data\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.034484 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4d5j\" (UniqueName: \"kubernetes.io/projected/add73ffa-6840-4db3-9286-8b647bbd7cfa-kube-api-access-q4d5j\") pod \"nova-api-0\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.034514 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx45n\" (UniqueName: \"kubernetes.io/projected/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-kube-api-access-zx45n\") pod \"nova-scheduler-0\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.040123 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.057278 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ff4dfdf57-42jzr"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.069539 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.070842 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.071234 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.086709 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ff4dfdf57-42jzr"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.119811 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-config-data\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.120389 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fw8d\" (UniqueName: \"kubernetes.io/projected/2a144efc-7575-47bf-8620-a248346b2823-kube-api-access-9fw8d\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.123983 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a144efc-7575-47bf-8620-a248346b2823-logs\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.124089 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.209699 4766 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h7l2d"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.211869 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.219697 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7l2d"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228028 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-nb\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228182 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a144efc-7575-47bf-8620-a248346b2823-logs\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228334 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz2vg\" (UniqueName: \"kubernetes.io/projected/e2f2552e-146e-4ba2-aa1e-b1a671255e30-kube-api-access-qz2vg\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228390 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 
04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228427 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-config\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228605 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-config-data\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228693 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-sb\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228715 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fw8d\" (UniqueName: \"kubernetes.io/projected/2a144efc-7575-47bf-8620-a248346b2823-kube-api-access-9fw8d\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228778 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-dns-svc\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.228850 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a144efc-7575-47bf-8620-a248346b2823-logs\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.237670 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.241245 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-config-data\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.256729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fw8d\" (UniqueName: \"kubernetes.io/projected/2a144efc-7575-47bf-8620-a248346b2823-kube-api-access-9fw8d\") pod \"nova-metadata-0\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.330813 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-sb\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.330881 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-dns-svc\") pod 
\"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.331888 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-sb\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.331934 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-dns-svc\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.330971 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcrdt\" (UniqueName: \"kubernetes.io/projected/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-kube-api-access-kcrdt\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.332002 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-nb\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.332096 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz2vg\" (UniqueName: \"kubernetes.io/projected/e2f2552e-146e-4ba2-aa1e-b1a671255e30-kube-api-access-qz2vg\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" 
(UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.332187 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-config\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.332730 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-nb\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.333003 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-config\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.333049 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-utilities\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.333077 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-catalog-content\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " 
pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.351013 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz2vg\" (UniqueName: \"kubernetes.io/projected/e2f2552e-146e-4ba2-aa1e-b1a671255e30-kube-api-access-qz2vg\") pod \"dnsmasq-dns-5ff4dfdf57-42jzr\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.389740 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.406154 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.435882 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-utilities\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.435925 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-catalog-content\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.436021 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcrdt\" (UniqueName: \"kubernetes.io/projected/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-kube-api-access-kcrdt\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 
04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.436756 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-utilities\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.436986 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-catalog-content\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.458131 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcrdt\" (UniqueName: \"kubernetes.io/projected/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-kube-api-access-kcrdt\") pod \"redhat-marketplace-h7l2d\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.533176 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kc2gn"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.591919 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.636748 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7j22r"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.638856 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.642030 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.642035 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.652732 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7j22r"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.717487 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.742626 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8tpb\" (UniqueName: \"kubernetes.io/projected/877cf27e-eccf-4910-97c0-3fc7793f63a9-kube-api-access-d8tpb\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.742691 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-scripts\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.742714 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-config-data\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " 
pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.742788 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.753238 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kc2gn" event={"ID":"98381e1d-1d58-4913-b9da-e5fd14a6eed9","Type":"ContainerStarted","Data":"bf4147719aae4040d6ee812dd92d9797cb5254bb0e898d2cfa94f22f778e43a6"} Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.844351 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8tpb\" (UniqueName: \"kubernetes.io/projected/877cf27e-eccf-4910-97c0-3fc7793f63a9-kube-api-access-d8tpb\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.844685 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-scripts\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.844704 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-config-data\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 
crc kubenswrapper[4766]: I1209 04:48:30.844776 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.849182 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-config-data\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.849335 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-scripts\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.850572 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.863428 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8tpb\" (UniqueName: \"kubernetes.io/projected/877cf27e-eccf-4910-97c0-3fc7793f63a9-kube-api-access-d8tpb\") pod \"nova-cell1-conductor-db-sync-7j22r\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:30 crc kubenswrapper[4766]: 
I1209 04:48:30.874081 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:30 crc kubenswrapper[4766]: I1209 04:48:30.915125 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.006337 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.047199 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ff4dfdf57-42jzr"] Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.062027 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:31 crc kubenswrapper[4766]: W1209 04:48:31.064742 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a144efc_7575_47bf_8620_a248346b2823.slice/crio-35ed174622e9242a23bff7ed4bbd3c3e33128f6b55338faa550e48d5311e058e WatchSource:0}: Error finding container 35ed174622e9242a23bff7ed4bbd3c3e33128f6b55338faa550e48d5311e058e: Status 404 returned error can't find the container with id 35ed174622e9242a23bff7ed4bbd3c3e33128f6b55338faa550e48d5311e058e Dec 09 04:48:31 crc kubenswrapper[4766]: W1209 04:48:31.087779 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2f2552e_146e_4ba2_aa1e_b1a671255e30.slice/crio-e932d83c23402177df2c21f741ea1942d157cca4bc99441e3f4e6f16ab8f2ff4 WatchSource:0}: Error finding container e932d83c23402177df2c21f741ea1942d157cca4bc99441e3f4e6f16ab8f2ff4: Status 404 returned error can't find the container with id e932d83c23402177df2c21f741ea1942d157cca4bc99441e3f4e6f16ab8f2ff4 Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.263036 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-h7l2d"] Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.534539 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7j22r"] Dec 09 04:48:31 crc kubenswrapper[4766]: W1209 04:48:31.564035 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod877cf27e_eccf_4910_97c0_3fc7793f63a9.slice/crio-54af39e62d262f1a54bcaebc889f76a00d11b02372752852aab562feaa43a19c WatchSource:0}: Error finding container 54af39e62d262f1a54bcaebc889f76a00d11b02372752852aab562feaa43a19c: Status 404 returned error can't find the container with id 54af39e62d262f1a54bcaebc889f76a00d11b02372752852aab562feaa43a19c Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.763035 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kc2gn" event={"ID":"98381e1d-1d58-4913-b9da-e5fd14a6eed9","Type":"ContainerStarted","Data":"3ff4eb02b86ecea7771b14f831fe75e24e15cf835687f22ddca2f7eb7fc1606f"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.766949 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"add73ffa-6840-4db3-9286-8b647bbd7cfa","Type":"ContainerStarted","Data":"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.767024 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"add73ffa-6840-4db3-9286-8b647bbd7cfa","Type":"ContainerStarted","Data":"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.767037 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"add73ffa-6840-4db3-9286-8b647bbd7cfa","Type":"ContainerStarted","Data":"6fb1f0abbff8b38f4a931592609c5ca121f7e41713ff6eb4bac56666004b498c"} Dec 09 04:48:31 crc 
kubenswrapper[4766]: I1209 04:48:31.776256 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7j22r" event={"ID":"877cf27e-eccf-4910-97c0-3fc7793f63a9","Type":"ContainerStarted","Data":"23bc3bba525e3a270ad6e8fcfb6b09a289c9151c273df598f32432e5427f8ff9"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.776311 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7j22r" event={"ID":"877cf27e-eccf-4910-97c0-3fc7793f63a9","Type":"ContainerStarted","Data":"54af39e62d262f1a54bcaebc889f76a00d11b02372752852aab562feaa43a19c"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.781014 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-kc2gn" podStartSLOduration=2.780995077 podStartE2EDuration="2.780995077s" podCreationTimestamp="2025-12-09 04:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:31.776257159 +0000 UTC m=+5793.485562605" watchObservedRunningTime="2025-12-09 04:48:31.780995077 +0000 UTC m=+5793.490300503" Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.781064 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b0eac65-3749-47de-baed-8af5751821d4","Type":"ContainerStarted","Data":"93b4178a41e8f4dd192b5955a9da98ea87a6c4a0ca5c33848b9c597a040e0803"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.781113 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b0eac65-3749-47de-baed-8af5751821d4","Type":"ContainerStarted","Data":"ce9287e308388089af896bfa36d09bca38ee22afaf5fe69f1192452031c48209"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.782854 4766 generic.go:334] "Generic (PLEG): container finished" podID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" 
containerID="25a50d61a5c5e21e089ed4921f38b661893d32c846f68f48a1157866ff620c10" exitCode=0 Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.782900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" event={"ID":"e2f2552e-146e-4ba2-aa1e-b1a671255e30","Type":"ContainerDied","Data":"25a50d61a5c5e21e089ed4921f38b661893d32c846f68f48a1157866ff620c10"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.782917 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" event={"ID":"e2f2552e-146e-4ba2-aa1e-b1a671255e30","Type":"ContainerStarted","Data":"e932d83c23402177df2c21f741ea1942d157cca4bc99441e3f4e6f16ab8f2ff4"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.787836 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03c4cfa1-d92a-4bae-97a8-37861d7ecf40","Type":"ContainerStarted","Data":"082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.787871 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03c4cfa1-d92a-4bae-97a8-37861d7ecf40","Type":"ContainerStarted","Data":"dcb834d02bafb6df27a92479a5a6c15076f21a8542c6dc5e333468ad5aaa14c6"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.802101 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a144efc-7575-47bf-8620-a248346b2823","Type":"ContainerStarted","Data":"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.802154 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a144efc-7575-47bf-8620-a248346b2823","Type":"ContainerStarted","Data":"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.802167 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a144efc-7575-47bf-8620-a248346b2823","Type":"ContainerStarted","Data":"35ed174622e9242a23bff7ed4bbd3c3e33128f6b55338faa550e48d5311e058e"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.808806 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.808789228 podStartE2EDuration="2.808789228s" podCreationTimestamp="2025-12-09 04:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:31.797203515 +0000 UTC m=+5793.506508941" watchObservedRunningTime="2025-12-09 04:48:31.808789228 +0000 UTC m=+5793.518094654" Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.812511 4766 generic.go:334] "Generic (PLEG): container finished" podID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerID="c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050" exitCode=0 Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.812553 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7l2d" event={"ID":"ad8ddd43-a7bd-46b0-99fa-83f2842a408c","Type":"ContainerDied","Data":"c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.812576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7l2d" event={"ID":"ad8ddd43-a7bd-46b0-99fa-83f2842a408c","Type":"ContainerStarted","Data":"f10d0703e1e0bd46e10123b15e74942912ee1a03e542b36fd5aac302fba69110"} Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.827727 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.8277108 podStartE2EDuration="2.8277108s" podCreationTimestamp="2025-12-09 04:48:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:31.822262923 +0000 UTC m=+5793.531568359" watchObservedRunningTime="2025-12-09 04:48:31.8277108 +0000 UTC m=+5793.537016226" Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.871504 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7j22r" podStartSLOduration=1.8714848530000001 podStartE2EDuration="1.871484853s" podCreationTimestamp="2025-12-09 04:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:31.862981593 +0000 UTC m=+5793.572287029" watchObservedRunningTime="2025-12-09 04:48:31.871484853 +0000 UTC m=+5793.580790279" Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.888326 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.888306048 podStartE2EDuration="2.888306048s" podCreationTimestamp="2025-12-09 04:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:31.87950867 +0000 UTC m=+5793.588814096" watchObservedRunningTime="2025-12-09 04:48:31.888306048 +0000 UTC m=+5793.597611474" Dec 09 04:48:31 crc kubenswrapper[4766]: I1209 04:48:31.924984 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.924965199 podStartE2EDuration="2.924965199s" podCreationTimestamp="2025-12-09 04:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:31.914389694 +0000 UTC m=+5793.623695120" watchObservedRunningTime="2025-12-09 04:48:31.924965199 +0000 UTC m=+5793.634270625" Dec 09 04:48:32 crc 
kubenswrapper[4766]: I1209 04:48:32.824050 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" event={"ID":"e2f2552e-146e-4ba2-aa1e-b1a671255e30","Type":"ContainerStarted","Data":"c1b58114885327f5c5843ea99009e000dc7f0dbc304a1e8df15ab465ab4fa1cb"} Dec 09 04:48:32 crc kubenswrapper[4766]: I1209 04:48:32.824449 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:32 crc kubenswrapper[4766]: I1209 04:48:32.832181 4766 generic.go:334] "Generic (PLEG): container finished" podID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerID="68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de" exitCode=0 Dec 09 04:48:32 crc kubenswrapper[4766]: I1209 04:48:32.832349 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7l2d" event={"ID":"ad8ddd43-a7bd-46b0-99fa-83f2842a408c","Type":"ContainerDied","Data":"68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de"} Dec 09 04:48:32 crc kubenswrapper[4766]: I1209 04:48:32.896862 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" podStartSLOduration=3.8968213929999997 podStartE2EDuration="3.896821393s" podCreationTimestamp="2025-12-09 04:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:32.851761215 +0000 UTC m=+5794.561066641" watchObservedRunningTime="2025-12-09 04:48:32.896821393 +0000 UTC m=+5794.606126819" Dec 09 04:48:33 crc kubenswrapper[4766]: I1209 04:48:33.845479 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7l2d" event={"ID":"ad8ddd43-a7bd-46b0-99fa-83f2842a408c","Type":"ContainerStarted","Data":"d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6"} Dec 09 04:48:33 crc kubenswrapper[4766]: I1209 
04:48:33.874730 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h7l2d" podStartSLOduration=2.404004479 podStartE2EDuration="3.87470485s" podCreationTimestamp="2025-12-09 04:48:30 +0000 UTC" firstStartedPulling="2025-12-09 04:48:31.816314412 +0000 UTC m=+5793.525619838" lastFinishedPulling="2025-12-09 04:48:33.287014783 +0000 UTC m=+5794.996320209" observedRunningTime="2025-12-09 04:48:33.867156997 +0000 UTC m=+5795.576462463" watchObservedRunningTime="2025-12-09 04:48:33.87470485 +0000 UTC m=+5795.584010276" Dec 09 04:48:34 crc kubenswrapper[4766]: I1209 04:48:34.858531 4766 generic.go:334] "Generic (PLEG): container finished" podID="877cf27e-eccf-4910-97c0-3fc7793f63a9" containerID="23bc3bba525e3a270ad6e8fcfb6b09a289c9151c273df598f32432e5427f8ff9" exitCode=0 Dec 09 04:48:34 crc kubenswrapper[4766]: I1209 04:48:34.859316 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7j22r" event={"ID":"877cf27e-eccf-4910-97c0-3fc7793f63a9","Type":"ContainerDied","Data":"23bc3bba525e3a270ad6e8fcfb6b09a289c9151c273df598f32432e5427f8ff9"} Dec 09 04:48:35 crc kubenswrapper[4766]: I1209 04:48:35.041417 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:35 crc kubenswrapper[4766]: I1209 04:48:35.071669 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 04:48:35 crc kubenswrapper[4766]: I1209 04:48:35.390495 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 04:48:35 crc kubenswrapper[4766]: I1209 04:48:35.390574 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 04:48:35 crc kubenswrapper[4766]: I1209 04:48:35.871972 4766 generic.go:334] "Generic (PLEG): container finished" podID="98381e1d-1d58-4913-b9da-e5fd14a6eed9" 
containerID="3ff4eb02b86ecea7771b14f831fe75e24e15cf835687f22ddca2f7eb7fc1606f" exitCode=0 Dec 09 04:48:35 crc kubenswrapper[4766]: I1209 04:48:35.872084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kc2gn" event={"ID":"98381e1d-1d58-4913-b9da-e5fd14a6eed9","Type":"ContainerDied","Data":"3ff4eb02b86ecea7771b14f831fe75e24e15cf835687f22ddca2f7eb7fc1606f"} Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.295898 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.465368 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-scripts\") pod \"877cf27e-eccf-4910-97c0-3fc7793f63a9\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.465417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8tpb\" (UniqueName: \"kubernetes.io/projected/877cf27e-eccf-4910-97c0-3fc7793f63a9-kube-api-access-d8tpb\") pod \"877cf27e-eccf-4910-97c0-3fc7793f63a9\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.465459 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-combined-ca-bundle\") pod \"877cf27e-eccf-4910-97c0-3fc7793f63a9\" (UID: \"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.465554 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-config-data\") pod \"877cf27e-eccf-4910-97c0-3fc7793f63a9\" (UID: 
\"877cf27e-eccf-4910-97c0-3fc7793f63a9\") " Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.471158 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-scripts" (OuterVolumeSpecName: "scripts") pod "877cf27e-eccf-4910-97c0-3fc7793f63a9" (UID: "877cf27e-eccf-4910-97c0-3fc7793f63a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.471434 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877cf27e-eccf-4910-97c0-3fc7793f63a9-kube-api-access-d8tpb" (OuterVolumeSpecName: "kube-api-access-d8tpb") pod "877cf27e-eccf-4910-97c0-3fc7793f63a9" (UID: "877cf27e-eccf-4910-97c0-3fc7793f63a9"). InnerVolumeSpecName "kube-api-access-d8tpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.504305 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "877cf27e-eccf-4910-97c0-3fc7793f63a9" (UID: "877cf27e-eccf-4910-97c0-3fc7793f63a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.507672 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-config-data" (OuterVolumeSpecName: "config-data") pod "877cf27e-eccf-4910-97c0-3fc7793f63a9" (UID: "877cf27e-eccf-4910-97c0-3fc7793f63a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.567723 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.567765 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.567778 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8tpb\" (UniqueName: \"kubernetes.io/projected/877cf27e-eccf-4910-97c0-3fc7793f63a9-kube-api-access-d8tpb\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.567792 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877cf27e-eccf-4910-97c0-3fc7793f63a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.888497 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7j22r" Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.888677 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7j22r" event={"ID":"877cf27e-eccf-4910-97c0-3fc7793f63a9","Type":"ContainerDied","Data":"54af39e62d262f1a54bcaebc889f76a00d11b02372752852aab562feaa43a19c"} Dec 09 04:48:36 crc kubenswrapper[4766]: I1209 04:48:36.888698 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54af39e62d262f1a54bcaebc889f76a00d11b02372752852aab562feaa43a19c" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.062912 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:48:38 crc kubenswrapper[4766]: E1209 04:48:38.071544 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.169666 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:48:38 crc kubenswrapper[4766]: E1209 04:48:38.170114 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877cf27e-eccf-4910-97c0-3fc7793f63a9" containerName="nova-cell1-conductor-db-sync" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.170129 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="877cf27e-eccf-4910-97c0-3fc7793f63a9" containerName="nova-cell1-conductor-db-sync" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.170475 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="877cf27e-eccf-4910-97c0-3fc7793f63a9" containerName="nova-cell1-conductor-db-sync" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.171161 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.174286 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.182025 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.356151 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.356276 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t952h\" (UniqueName: \"kubernetes.io/projected/60a1d753-d329-4f92-a520-aac278c6acd9-kube-api-access-t952h\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.356346 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.408768 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.458366 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.458509 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t952h\" (UniqueName: \"kubernetes.io/projected/60a1d753-d329-4f92-a520-aac278c6acd9-kube-api-access-t952h\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.458586 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.466266 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.466275 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 
04:48:38.473691 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t952h\" (UniqueName: \"kubernetes.io/projected/60a1d753-d329-4f92-a520-aac278c6acd9-kube-api-access-t952h\") pod \"nova-cell1-conductor-0\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.508414 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.559949 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkn7r\" (UniqueName: \"kubernetes.io/projected/98381e1d-1d58-4913-b9da-e5fd14a6eed9-kube-api-access-hkn7r\") pod \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.560107 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-scripts\") pod \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.560378 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-config-data\") pod \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.560447 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-combined-ca-bundle\") pod \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\" (UID: \"98381e1d-1d58-4913-b9da-e5fd14a6eed9\") " Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.563709 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98381e1d-1d58-4913-b9da-e5fd14a6eed9-kube-api-access-hkn7r" (OuterVolumeSpecName: "kube-api-access-hkn7r") pod "98381e1d-1d58-4913-b9da-e5fd14a6eed9" (UID: "98381e1d-1d58-4913-b9da-e5fd14a6eed9"). InnerVolumeSpecName "kube-api-access-hkn7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.564945 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-scripts" (OuterVolumeSpecName: "scripts") pod "98381e1d-1d58-4913-b9da-e5fd14a6eed9" (UID: "98381e1d-1d58-4913-b9da-e5fd14a6eed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.595896 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-config-data" (OuterVolumeSpecName: "config-data") pod "98381e1d-1d58-4913-b9da-e5fd14a6eed9" (UID: "98381e1d-1d58-4913-b9da-e5fd14a6eed9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.597175 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98381e1d-1d58-4913-b9da-e5fd14a6eed9" (UID: "98381e1d-1d58-4913-b9da-e5fd14a6eed9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.662115 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.662140 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.662152 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98381e1d-1d58-4913-b9da-e5fd14a6eed9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:38 crc kubenswrapper[4766]: I1209 04:48:38.662162 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkn7r\" (UniqueName: \"kubernetes.io/projected/98381e1d-1d58-4913-b9da-e5fd14a6eed9-kube-api-access-hkn7r\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.000476 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:48:39 crc kubenswrapper[4766]: W1209 04:48:39.008275 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60a1d753_d329_4f92_a520_aac278c6acd9.slice/crio-d9dcc239e0cbacd77f380825411f684f9026a3a6f6179c3b11dd77ec5cee9602 WatchSource:0}: Error finding container d9dcc239e0cbacd77f380825411f684f9026a3a6f6179c3b11dd77ec5cee9602: Status 404 returned error can't find the container with id d9dcc239e0cbacd77f380825411f684f9026a3a6f6179c3b11dd77ec5cee9602 Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.120190 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"60a1d753-d329-4f92-a520-aac278c6acd9","Type":"ContainerStarted","Data":"d9dcc239e0cbacd77f380825411f684f9026a3a6f6179c3b11dd77ec5cee9602"} Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.122528 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kc2gn" event={"ID":"98381e1d-1d58-4913-b9da-e5fd14a6eed9","Type":"ContainerDied","Data":"bf4147719aae4040d6ee812dd92d9797cb5254bb0e898d2cfa94f22f778e43a6"} Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.122640 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf4147719aae4040d6ee812dd92d9797cb5254bb0e898d2cfa94f22f778e43a6" Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.122874 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kc2gn" Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.245891 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.246461 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="03c4cfa1-d92a-4bae-97a8-37861d7ecf40" containerName="nova-scheduler-scheduler" containerID="cri-o://082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80" gracePeriod=30 Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.260409 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.260684 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerName="nova-api-log" containerID="cri-o://47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec" gracePeriod=30 Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.260828 4766 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/nova-api-0" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerName="nova-api-api" containerID="cri-o://bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25" gracePeriod=30 Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.270042 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.270287 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2a144efc-7575-47bf-8620-a248346b2823" containerName="nova-metadata-log" containerID="cri-o://1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed" gracePeriod=30 Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.270389 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2a144efc-7575-47bf-8620-a248346b2823" containerName="nova-metadata-metadata" containerID="cri-o://4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc" gracePeriod=30 Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.882704 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:48:39 crc kubenswrapper[4766]: I1209 04:48:39.894721 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.009630 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a144efc-7575-47bf-8620-a248346b2823-logs\") pod \"2a144efc-7575-47bf-8620-a248346b2823\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.009729 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4d5j\" (UniqueName: \"kubernetes.io/projected/add73ffa-6840-4db3-9286-8b647bbd7cfa-kube-api-access-q4d5j\") pod \"add73ffa-6840-4db3-9286-8b647bbd7cfa\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.009756 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-config-data\") pod \"2a144efc-7575-47bf-8620-a248346b2823\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.009825 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-combined-ca-bundle\") pod \"2a144efc-7575-47bf-8620-a248346b2823\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.009894 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-config-data\") pod \"add73ffa-6840-4db3-9286-8b647bbd7cfa\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.009952 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/add73ffa-6840-4db3-9286-8b647bbd7cfa-logs\") pod \"add73ffa-6840-4db3-9286-8b647bbd7cfa\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.009969 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a144efc-7575-47bf-8620-a248346b2823-logs" (OuterVolumeSpecName: "logs") pod "2a144efc-7575-47bf-8620-a248346b2823" (UID: "2a144efc-7575-47bf-8620-a248346b2823"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.009977 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-combined-ca-bundle\") pod \"add73ffa-6840-4db3-9286-8b647bbd7cfa\" (UID: \"add73ffa-6840-4db3-9286-8b647bbd7cfa\") " Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.010091 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fw8d\" (UniqueName: \"kubernetes.io/projected/2a144efc-7575-47bf-8620-a248346b2823-kube-api-access-9fw8d\") pod \"2a144efc-7575-47bf-8620-a248346b2823\" (UID: \"2a144efc-7575-47bf-8620-a248346b2823\") " Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.010484 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a144efc-7575-47bf-8620-a248346b2823-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.011631 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/add73ffa-6840-4db3-9286-8b647bbd7cfa-logs" (OuterVolumeSpecName: "logs") pod "add73ffa-6840-4db3-9286-8b647bbd7cfa" (UID: "add73ffa-6840-4db3-9286-8b647bbd7cfa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.017364 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a144efc-7575-47bf-8620-a248346b2823-kube-api-access-9fw8d" (OuterVolumeSpecName: "kube-api-access-9fw8d") pod "2a144efc-7575-47bf-8620-a248346b2823" (UID: "2a144efc-7575-47bf-8620-a248346b2823"). InnerVolumeSpecName "kube-api-access-9fw8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.031515 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add73ffa-6840-4db3-9286-8b647bbd7cfa-kube-api-access-q4d5j" (OuterVolumeSpecName: "kube-api-access-q4d5j") pod "add73ffa-6840-4db3-9286-8b647bbd7cfa" (UID: "add73ffa-6840-4db3-9286-8b647bbd7cfa"). InnerVolumeSpecName "kube-api-access-q4d5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.037908 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-config-data" (OuterVolumeSpecName: "config-data") pod "add73ffa-6840-4db3-9286-8b647bbd7cfa" (UID: "add73ffa-6840-4db3-9286-8b647bbd7cfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.041198 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.047097 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-config-data" (OuterVolumeSpecName: "config-data") pod "2a144efc-7575-47bf-8620-a248346b2823" (UID: "2a144efc-7575-47bf-8620-a248346b2823"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.049564 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a144efc-7575-47bf-8620-a248346b2823" (UID: "2a144efc-7575-47bf-8620-a248346b2823"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.051220 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.060785 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "add73ffa-6840-4db3-9286-8b647bbd7cfa" (UID: "add73ffa-6840-4db3-9286-8b647bbd7cfa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.112711 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fw8d\" (UniqueName: \"kubernetes.io/projected/2a144efc-7575-47bf-8620-a248346b2823-kube-api-access-9fw8d\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.112738 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.112747 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4d5j\" (UniqueName: \"kubernetes.io/projected/add73ffa-6840-4db3-9286-8b647bbd7cfa-kube-api-access-q4d5j\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.112759 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a144efc-7575-47bf-8620-a248346b2823-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.112845 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.112855 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/add73ffa-6840-4db3-9286-8b647bbd7cfa-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.112863 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add73ffa-6840-4db3-9286-8b647bbd7cfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.131131 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60a1d753-d329-4f92-a520-aac278c6acd9","Type":"ContainerStarted","Data":"0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff"} Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.132179 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.134381 4766 generic.go:334] "Generic (PLEG): container finished" podID="2a144efc-7575-47bf-8620-a248346b2823" containerID="4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc" exitCode=0 Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.134403 4766 generic.go:334] "Generic (PLEG): container finished" podID="2a144efc-7575-47bf-8620-a248346b2823" containerID="1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed" exitCode=143 Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.134441 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a144efc-7575-47bf-8620-a248346b2823","Type":"ContainerDied","Data":"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc"} Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.134461 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a144efc-7575-47bf-8620-a248346b2823","Type":"ContainerDied","Data":"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed"} Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.134474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a144efc-7575-47bf-8620-a248346b2823","Type":"ContainerDied","Data":"35ed174622e9242a23bff7ed4bbd3c3e33128f6b55338faa550e48d5311e058e"} Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.134493 4766 scope.go:117] "RemoveContainer" 
containerID="4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.134584 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.139031 4766 generic.go:334] "Generic (PLEG): container finished" podID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerID="bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25" exitCode=0 Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.139047 4766 generic.go:334] "Generic (PLEG): container finished" podID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerID="47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec" exitCode=143 Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.139060 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"add73ffa-6840-4db3-9286-8b647bbd7cfa","Type":"ContainerDied","Data":"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25"} Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.139078 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"add73ffa-6840-4db3-9286-8b647bbd7cfa","Type":"ContainerDied","Data":"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec"} Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.139090 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"add73ffa-6840-4db3-9286-8b647bbd7cfa","Type":"ContainerDied","Data":"6fb1f0abbff8b38f4a931592609c5ca121f7e41713ff6eb4bac56666004b498c"} Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.139103 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.155369 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.161040 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.16102042 podStartE2EDuration="2.16102042s" podCreationTimestamp="2025-12-09 04:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:40.155315626 +0000 UTC m=+5801.864621052" watchObservedRunningTime="2025-12-09 04:48:40.16102042 +0000 UTC m=+5801.870325846" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.165774 4766 scope.go:117] "RemoveContainer" containerID="1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.185702 4766 scope.go:117] "RemoveContainer" containerID="4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc" Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.189990 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc\": container with ID starting with 4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc not found: ID does not exist" containerID="4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.190055 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc"} err="failed to get container status \"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc\": rpc error: code = NotFound desc = could not find 
container \"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc\": container with ID starting with 4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc not found: ID does not exist" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.190103 4766 scope.go:117] "RemoveContainer" containerID="1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed" Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.199496 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed\": container with ID starting with 1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed not found: ID does not exist" containerID="1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.199548 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed"} err="failed to get container status \"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed\": rpc error: code = NotFound desc = could not find container \"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed\": container with ID starting with 1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed not found: ID does not exist" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.199575 4766 scope.go:117] "RemoveContainer" containerID="4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.199863 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc"} err="failed to get container status \"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc\": rpc error: code = NotFound desc = could 
not find container \"4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc\": container with ID starting with 4a2e2d33913cec8724e013dd02153af7aa61dcc42f4f5e004b52bb44716c5bdc not found: ID does not exist" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.199887 4766 scope.go:117] "RemoveContainer" containerID="1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.200076 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed"} err="failed to get container status \"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed\": rpc error: code = NotFound desc = could not find container \"1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed\": container with ID starting with 1e06cc7615bb62fcd901a81b98c61edf1feb44920e7ef134972cf221ce8715ed not found: ID does not exist" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.200095 4766 scope.go:117] "RemoveContainer" containerID="bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.206032 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.229530 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.245282 4766 scope.go:117] "RemoveContainer" containerID="47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.245397 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.256424 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 
04:48:40.300269 4766 scope.go:117] "RemoveContainer" containerID="bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.300324 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.301689 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25\": container with ID starting with bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25 not found: ID does not exist" containerID="bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.301939 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25"} err="failed to get container status \"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25\": rpc error: code = NotFound desc = could not find container \"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25\": container with ID starting with bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25 not found: ID does not exist" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.302083 4766 scope.go:117] "RemoveContainer" containerID="47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec" Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.301791 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerName="nova-api-log" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.306803 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerName="nova-api-log" Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.307143 4766 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2a144efc-7575-47bf-8620-a248346b2823" containerName="nova-metadata-log" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.307255 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a144efc-7575-47bf-8620-a248346b2823" containerName="nova-metadata-log" Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.307379 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a144efc-7575-47bf-8620-a248346b2823" containerName="nova-metadata-metadata" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.307464 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a144efc-7575-47bf-8620-a248346b2823" containerName="nova-metadata-metadata" Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.307570 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerName="nova-api-api" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.307655 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerName="nova-api-api" Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.307759 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98381e1d-1d58-4913-b9da-e5fd14a6eed9" containerName="nova-manage" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.307834 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="98381e1d-1d58-4913-b9da-e5fd14a6eed9" containerName="nova-manage" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.308842 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a144efc-7575-47bf-8620-a248346b2823" containerName="nova-metadata-metadata" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.308960 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerName="nova-api-api" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.309055 4766 
memory_manager.go:354] "RemoveStaleState removing state" podUID="98381e1d-1d58-4913-b9da-e5fd14a6eed9" containerName="nova-manage" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.309157 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" containerName="nova-api-log" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.309258 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a144efc-7575-47bf-8620-a248346b2823" containerName="nova-metadata-log" Dec 09 04:48:40 crc kubenswrapper[4766]: E1209 04:48:40.312463 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec\": container with ID starting with 47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec not found: ID does not exist" containerID="47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.312718 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec"} err="failed to get container status \"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec\": rpc error: code = NotFound desc = could not find container \"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec\": container with ID starting with 47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec not found: ID does not exist" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.312892 4766 scope.go:117] "RemoveContainer" containerID="bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.317791 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25"} err="failed to get container status \"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25\": rpc error: code = NotFound desc = could not find container \"bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25\": container with ID starting with bc2433d0afcb1f00963a2fc554acac3cd4454e007a6ef8c9bed9ca704195fc25 not found: ID does not exist" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.317847 4766 scope.go:117] "RemoveContainer" containerID="47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.318531 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.325436 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.336674 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec"} err="failed to get container status \"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec\": rpc error: code = NotFound desc = could not find container \"47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec\": container with ID starting with 47bc14e4097396270ede70f9021cbeb12e9d9aca540ac4cd4fcf251679c84dec not found: ID does not exist" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.407197 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.417478 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.428085 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.428690 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-config-data\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.428806 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmrv6\" (UniqueName: \"kubernetes.io/projected/9090014c-4a18-45da-8f15-7d3f8ae49bc7-kube-api-access-cmrv6\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.428961 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9090014c-4a18-45da-8f15-7d3f8ae49bc7-logs\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.461321 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.462758 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.480492 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.502454 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.530501 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-config-data\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.536364 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9aacd-c83f-4fb4-9355-9d9e67816128-logs\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.536531 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.536891 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.537272 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-config-data\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.537486 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmrv6\" (UniqueName: \"kubernetes.io/projected/9090014c-4a18-45da-8f15-7d3f8ae49bc7-kube-api-access-cmrv6\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.537715 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48wr5\" (UniqueName: \"kubernetes.io/projected/0be9aacd-c83f-4fb4-9355-9d9e67816128-kube-api-access-48wr5\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.537870 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9090014c-4a18-45da-8f15-7d3f8ae49bc7-logs\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.538509 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9090014c-4a18-45da-8f15-7d3f8ae49bc7-logs\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.556428 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-config-data\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc 
kubenswrapper[4766]: I1209 04:48:40.556768 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.562086 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmrv6\" (UniqueName: \"kubernetes.io/projected/9090014c-4a18-45da-8f15-7d3f8ae49bc7-kube-api-access-cmrv6\") pod \"nova-metadata-0\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.585680 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-664dd56477-qp45x"] Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.585890 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-664dd56477-qp45x" podUID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" containerName="dnsmasq-dns" containerID="cri-o://cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161" gracePeriod=10 Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.592650 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.592704 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.639838 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48wr5\" (UniqueName: \"kubernetes.io/projected/0be9aacd-c83f-4fb4-9355-9d9e67816128-kube-api-access-48wr5\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc 
kubenswrapper[4766]: I1209 04:48:40.639900 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-config-data\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.639968 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.639991 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9aacd-c83f-4fb4-9355-9d9e67816128-logs\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.640497 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9aacd-c83f-4fb4-9355-9d9e67816128-logs\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.643714 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.643812 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-config-data\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " 
pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.655503 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.659979 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48wr5\" (UniqueName: \"kubernetes.io/projected/0be9aacd-c83f-4fb4-9355-9d9e67816128-kube-api-access-48wr5\") pod \"nova-api-0\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.810507 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.846572 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.850908 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a144efc-7575-47bf-8620-a248346b2823" path="/var/lib/kubelet/pods/2a144efc-7575-47bf-8620-a248346b2823/volumes" Dec 09 04:48:40 crc kubenswrapper[4766]: I1209 04:48:40.851684 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add73ffa-6840-4db3-9286-8b647bbd7cfa" path="/var/lib/kubelet/pods/add73ffa-6840-4db3-9286-8b647bbd7cfa/volumes" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.015685 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-664dd56477-qp45x" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.152442 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfdvw\" (UniqueName: \"kubernetes.io/projected/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-kube-api-access-zfdvw\") pod \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.153053 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-nb\") pod \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.153107 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-sb\") pod \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.153437 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-config\") pod \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.153481 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-dns-svc\") pod \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\" (UID: \"60a38c6f-bc51-42ac-84f0-0ad2dda1c886\") " Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.160764 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-kube-api-access-zfdvw" (OuterVolumeSpecName: "kube-api-access-zfdvw") pod "60a38c6f-bc51-42ac-84f0-0ad2dda1c886" (UID: "60a38c6f-bc51-42ac-84f0-0ad2dda1c886"). InnerVolumeSpecName "kube-api-access-zfdvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.160892 4766 generic.go:334] "Generic (PLEG): container finished" podID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" containerID="cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161" exitCode=0 Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.160957 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664dd56477-qp45x" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.160950 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664dd56477-qp45x" event={"ID":"60a38c6f-bc51-42ac-84f0-0ad2dda1c886","Type":"ContainerDied","Data":"cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161"} Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.161077 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664dd56477-qp45x" event={"ID":"60a38c6f-bc51-42ac-84f0-0ad2dda1c886","Type":"ContainerDied","Data":"0f931cecffed7e1332881c664ee59df8f65c831efbc712851ece5b42aea4904f"} Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.161103 4766 scope.go:117] "RemoveContainer" containerID="cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.193466 4766 scope.go:117] "RemoveContainer" containerID="00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.214360 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-sb" (OuterVolumeSpecName: 
"ovsdbserver-sb") pod "60a38c6f-bc51-42ac-84f0-0ad2dda1c886" (UID: "60a38c6f-bc51-42ac-84f0-0ad2dda1c886"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.222904 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60a38c6f-bc51-42ac-84f0-0ad2dda1c886" (UID: "60a38c6f-bc51-42ac-84f0-0ad2dda1c886"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.225773 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-config" (OuterVolumeSpecName: "config") pod "60a38c6f-bc51-42ac-84f0-0ad2dda1c886" (UID: "60a38c6f-bc51-42ac-84f0-0ad2dda1c886"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.231979 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60a38c6f-bc51-42ac-84f0-0ad2dda1c886" (UID: "60a38c6f-bc51-42ac-84f0-0ad2dda1c886"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.240264 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.249328 4766 scope.go:117] "RemoveContainer" containerID="cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161" Dec 09 04:48:41 crc kubenswrapper[4766]: E1209 04:48:41.250078 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161\": container with ID starting with cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161 not found: ID does not exist" containerID="cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.250110 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161"} err="failed to get container status \"cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161\": rpc error: code = NotFound desc = could not find container \"cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161\": container with ID starting with cd6795ad4b03e9418027dea3350f06804ee90a272914b09a9ae9a6d3f4402161 not found: ID does not exist" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.250133 4766 scope.go:117] "RemoveContainer" containerID="00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d" Dec 09 04:48:41 crc kubenswrapper[4766]: E1209 04:48:41.250488 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d\": container with ID starting with 
00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d not found: ID does not exist" containerID="00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.250506 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d"} err="failed to get container status \"00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d\": rpc error: code = NotFound desc = could not find container \"00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d\": container with ID starting with 00dcc185a7fb92a713789170f817428f76fe81e09d3e00796aec29a343fb463d not found: ID does not exist" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.255749 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.255775 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.255787 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfdvw\" (UniqueName: \"kubernetes.io/projected/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-kube-api-access-zfdvw\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.255798 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.255809 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/60a38c6f-bc51-42ac-84f0-0ad2dda1c886-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.289021 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7l2d"] Dec 09 04:48:41 crc kubenswrapper[4766]: I1209 04:48:41.332328 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:41.441323 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:41.569013 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-664dd56477-qp45x"] Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:41.579098 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-664dd56477-qp45x"] Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.181325 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0be9aacd-c83f-4fb4-9355-9d9e67816128","Type":"ContainerStarted","Data":"31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489"} Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.182266 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0be9aacd-c83f-4fb4-9355-9d9e67816128","Type":"ContainerStarted","Data":"4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c"} Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.182346 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0be9aacd-c83f-4fb4-9355-9d9e67816128","Type":"ContainerStarted","Data":"381321f457543ed6f705e1fb529950aa74ed84798cca5e365e7d1d3bbcec8f7c"} Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.185573 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9090014c-4a18-45da-8f15-7d3f8ae49bc7","Type":"ContainerStarted","Data":"41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f"} Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.185627 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9090014c-4a18-45da-8f15-7d3f8ae49bc7","Type":"ContainerStarted","Data":"bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2"} Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.185641 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9090014c-4a18-45da-8f15-7d3f8ae49bc7","Type":"ContainerStarted","Data":"c8e63bc85dd3fc10c6344378c41784c7c597d0cff99836f404994eaa8a961b20"} Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.205667 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.205651497 podStartE2EDuration="2.205651497s" podCreationTimestamp="2025-12-09 04:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:42.199860391 +0000 UTC m=+5803.909165817" watchObservedRunningTime="2025-12-09 04:48:42.205651497 +0000 UTC m=+5803.914956923" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.226072 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.226054728 podStartE2EDuration="2.226054728s" podCreationTimestamp="2025-12-09 04:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:42.216956463 +0000 UTC m=+5803.926261889" watchObservedRunningTime="2025-12-09 04:48:42.226054728 +0000 UTC m=+5803.935360154" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.733700 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.851580 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" path="/var/lib/kubelet/pods/60a38c6f-bc51-42ac-84f0-0ad2dda1c886/volumes" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.882074 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-combined-ca-bundle\") pod \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.882351 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx45n\" (UniqueName: \"kubernetes.io/projected/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-kube-api-access-zx45n\") pod \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.882449 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-config-data\") pod \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\" (UID: \"03c4cfa1-d92a-4bae-97a8-37861d7ecf40\") " Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.886823 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-kube-api-access-zx45n" (OuterVolumeSpecName: "kube-api-access-zx45n") pod "03c4cfa1-d92a-4bae-97a8-37861d7ecf40" (UID: "03c4cfa1-d92a-4bae-97a8-37861d7ecf40"). InnerVolumeSpecName "kube-api-access-zx45n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.904753 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-config-data" (OuterVolumeSpecName: "config-data") pod "03c4cfa1-d92a-4bae-97a8-37861d7ecf40" (UID: "03c4cfa1-d92a-4bae-97a8-37861d7ecf40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.911914 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03c4cfa1-d92a-4bae-97a8-37861d7ecf40" (UID: "03c4cfa1-d92a-4bae-97a8-37861d7ecf40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.984998 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx45n\" (UniqueName: \"kubernetes.io/projected/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-kube-api-access-zx45n\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.985040 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:42 crc kubenswrapper[4766]: I1209 04:48:42.985052 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c4cfa1-d92a-4bae-97a8-37861d7ecf40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.199325 4766 generic.go:334] "Generic (PLEG): container finished" podID="03c4cfa1-d92a-4bae-97a8-37861d7ecf40" containerID="082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80" 
exitCode=0 Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.199468 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.199473 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03c4cfa1-d92a-4bae-97a8-37861d7ecf40","Type":"ContainerDied","Data":"082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80"} Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.199561 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03c4cfa1-d92a-4bae-97a8-37861d7ecf40","Type":"ContainerDied","Data":"dcb834d02bafb6df27a92479a5a6c15076f21a8542c6dc5e333468ad5aaa14c6"} Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.199612 4766 scope.go:117] "RemoveContainer" containerID="082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.200532 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h7l2d" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerName="registry-server" containerID="cri-o://d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6" gracePeriod=2 Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.243880 4766 scope.go:117] "RemoveContainer" containerID="082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80" Dec 09 04:48:43 crc kubenswrapper[4766]: E1209 04:48:43.244378 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80\": container with ID starting with 082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80 not found: ID does not exist" containerID="082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80" Dec 09 04:48:43 
crc kubenswrapper[4766]: I1209 04:48:43.244413 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80"} err="failed to get container status \"082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80\": rpc error: code = NotFound desc = could not find container \"082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80\": container with ID starting with 082b1168df09c0ff73ef066f652c5778959eed93471fbaa9d2b4461b5b586f80 not found: ID does not exist" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.257639 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.270777 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.281973 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:43 crc kubenswrapper[4766]: E1209 04:48:43.282454 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" containerName="init" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.282475 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" containerName="init" Dec 09 04:48:43 crc kubenswrapper[4766]: E1209 04:48:43.282488 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" containerName="dnsmasq-dns" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.282495 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" containerName="dnsmasq-dns" Dec 09 04:48:43 crc kubenswrapper[4766]: E1209 04:48:43.282515 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c4cfa1-d92a-4bae-97a8-37861d7ecf40" 
containerName="nova-scheduler-scheduler" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.282521 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c4cfa1-d92a-4bae-97a8-37861d7ecf40" containerName="nova-scheduler-scheduler" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.282699 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c4cfa1-d92a-4bae-97a8-37861d7ecf40" containerName="nova-scheduler-scheduler" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.282721 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a38c6f-bc51-42ac-84f0-0ad2dda1c886" containerName="dnsmasq-dns" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.283390 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.286203 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.308938 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.400758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hhr9\" (UniqueName: \"kubernetes.io/projected/83885030-1eda-4c1e-954f-6793d46f62f0-kube-api-access-4hhr9\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.400886 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-config-data\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.400932 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.502693 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hhr9\" (UniqueName: \"kubernetes.io/projected/83885030-1eda-4c1e-954f-6793d46f62f0-kube-api-access-4hhr9\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.502757 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-config-data\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.502785 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.506288 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-config-data\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.508132 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.530778 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hhr9\" (UniqueName: \"kubernetes.io/projected/83885030-1eda-4c1e-954f-6793d46f62f0-kube-api-access-4hhr9\") pod \"nova-scheduler-0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.626779 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.704914 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-utilities\") pod \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.705082 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcrdt\" (UniqueName: \"kubernetes.io/projected/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-kube-api-access-kcrdt\") pod \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.705642 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-catalog-content\") pod \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\" (UID: \"ad8ddd43-a7bd-46b0-99fa-83f2842a408c\") " Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.706050 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-utilities" (OuterVolumeSpecName: "utilities") pod "ad8ddd43-a7bd-46b0-99fa-83f2842a408c" (UID: "ad8ddd43-a7bd-46b0-99fa-83f2842a408c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.706543 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.709181 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-kube-api-access-kcrdt" (OuterVolumeSpecName: "kube-api-access-kcrdt") pod "ad8ddd43-a7bd-46b0-99fa-83f2842a408c" (UID: "ad8ddd43-a7bd-46b0-99fa-83f2842a408c"). InnerVolumeSpecName "kube-api-access-kcrdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.717966 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.737323 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad8ddd43-a7bd-46b0-99fa-83f2842a408c" (UID: "ad8ddd43-a7bd-46b0-99fa-83f2842a408c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.810957 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcrdt\" (UniqueName: \"kubernetes.io/projected/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-kube-api-access-kcrdt\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:43 crc kubenswrapper[4766]: I1209 04:48:43.810991 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad8ddd43-a7bd-46b0-99fa-83f2842a408c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.195187 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:44 crc kubenswrapper[4766]: W1209 04:48:44.196548 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83885030_1eda_4c1e_954f_6793d46f62f0.slice/crio-fd2220d6a104d320ef651e4595368f48fd77d42a426c8fd0e0d57d2b94f2924a WatchSource:0}: Error finding container fd2220d6a104d320ef651e4595368f48fd77d42a426c8fd0e0d57d2b94f2924a: Status 404 returned error can't find the container with id fd2220d6a104d320ef651e4595368f48fd77d42a426c8fd0e0d57d2b94f2924a Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.210085 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"83885030-1eda-4c1e-954f-6793d46f62f0","Type":"ContainerStarted","Data":"fd2220d6a104d320ef651e4595368f48fd77d42a426c8fd0e0d57d2b94f2924a"} Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.217339 4766 generic.go:334] "Generic (PLEG): container finished" podID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerID="d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6" exitCode=0 Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.217452 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7l2d" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.217530 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7l2d" event={"ID":"ad8ddd43-a7bd-46b0-99fa-83f2842a408c","Type":"ContainerDied","Data":"d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6"} Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.217589 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7l2d" event={"ID":"ad8ddd43-a7bd-46b0-99fa-83f2842a408c","Type":"ContainerDied","Data":"f10d0703e1e0bd46e10123b15e74942912ee1a03e542b36fd5aac302fba69110"} Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.217613 4766 scope.go:117] "RemoveContainer" containerID="d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.241485 4766 scope.go:117] "RemoveContainer" containerID="68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.264061 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7l2d"] Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.269570 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7l2d"] Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.289889 4766 scope.go:117] "RemoveContainer" containerID="c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.316335 4766 scope.go:117] "RemoveContainer" containerID="d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6" Dec 09 04:48:44 crc kubenswrapper[4766]: E1209 04:48:44.316672 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6\": container with ID starting with d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6 not found: ID does not exist" containerID="d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.316706 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6"} err="failed to get container status \"d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6\": rpc error: code = NotFound desc = could not find container \"d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6\": container with ID starting with d4e30c0c0bf15d19c500f9ed4f15698df653c962f58a783757cf99932424f3c6 not found: ID does not exist" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.316729 4766 scope.go:117] "RemoveContainer" containerID="68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de" Dec 09 04:48:44 crc kubenswrapper[4766]: E1209 04:48:44.316939 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de\": container with ID starting with 68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de not found: ID does not exist" containerID="68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.316967 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de"} err="failed to get container status \"68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de\": rpc error: code = NotFound desc = could not find container \"68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de\": container with ID 
starting with 68c69c6bfbcc420d978857ae95a8820801a98edfcca2227de74ea772ae2898de not found: ID does not exist" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.316984 4766 scope.go:117] "RemoveContainer" containerID="c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050" Dec 09 04:48:44 crc kubenswrapper[4766]: E1209 04:48:44.317374 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050\": container with ID starting with c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050 not found: ID does not exist" containerID="c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.317405 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050"} err="failed to get container status \"c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050\": rpc error: code = NotFound desc = could not find container \"c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050\": container with ID starting with c9d736cf59785c82e045521f59426f76e3c580003bed38a5bff9535f422cd050 not found: ID does not exist" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.848527 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03c4cfa1-d92a-4bae-97a8-37861d7ecf40" path="/var/lib/kubelet/pods/03c4cfa1-d92a-4bae-97a8-37861d7ecf40/volumes" Dec 09 04:48:44 crc kubenswrapper[4766]: I1209 04:48:44.849252 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" path="/var/lib/kubelet/pods/ad8ddd43-a7bd-46b0-99fa-83f2842a408c/volumes" Dec 09 04:48:45 crc kubenswrapper[4766]: I1209 04:48:45.231698 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"83885030-1eda-4c1e-954f-6793d46f62f0","Type":"ContainerStarted","Data":"da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e"} Dec 09 04:48:45 crc kubenswrapper[4766]: I1209 04:48:45.275576 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.275554051 podStartE2EDuration="2.275554051s" podCreationTimestamp="2025-12-09 04:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:45.267676868 +0000 UTC m=+5806.976982304" watchObservedRunningTime="2025-12-09 04:48:45.275554051 +0000 UTC m=+5806.984859487" Dec 09 04:48:45 crc kubenswrapper[4766]: I1209 04:48:45.811729 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 04:48:45 crc kubenswrapper[4766]: I1209 04:48:45.812240 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 04:48:48 crc kubenswrapper[4766]: I1209 04:48:48.562205 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 09 04:48:48 crc kubenswrapper[4766]: I1209 04:48:48.718363 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.088668 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7c8md"] Dec 09 04:48:49 crc kubenswrapper[4766]: E1209 04:48:49.089055 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerName="extract-utilities" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.089071 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerName="extract-utilities" Dec 09 04:48:49 crc kubenswrapper[4766]: E1209 
04:48:49.089097 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerName="registry-server" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.089104 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerName="registry-server" Dec 09 04:48:49 crc kubenswrapper[4766]: E1209 04:48:49.089126 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerName="extract-content" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.089133 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerName="extract-content" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.089308 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8ddd43-a7bd-46b0-99fa-83f2842a408c" containerName="registry-server" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.089965 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.095027 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.096839 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.107029 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7c8md"] Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.254663 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-scripts\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.255625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld5pp\" (UniqueName: \"kubernetes.io/projected/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-kube-api-access-ld5pp\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.256030 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-config-data\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.256090 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.359470 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld5pp\" (UniqueName: \"kubernetes.io/projected/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-kube-api-access-ld5pp\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.359538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-config-data\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.359578 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.359675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-scripts\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.366712 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-scripts\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.367291 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-config-data\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.368041 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.376815 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld5pp\" (UniqueName: \"kubernetes.io/projected/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-kube-api-access-ld5pp\") pod \"nova-cell1-cell-mapping-7c8md\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.468711 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.839143 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:48:49 crc kubenswrapper[4766]: E1209 04:48:49.839618 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:48:49 crc kubenswrapper[4766]: I1209 04:48:49.947889 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7c8md"] Dec 09 04:48:50 crc kubenswrapper[4766]: I1209 04:48:50.294126 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7c8md" event={"ID":"b4e53d15-c42f-4959-9ed5-ddff8241b3d0","Type":"ContainerStarted","Data":"7a606ef6a26c059612d4536fbcc43763558f4d1f249a3613caa872f83e394901"} Dec 09 04:48:50 crc kubenswrapper[4766]: I1209 04:48:50.294913 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7c8md" event={"ID":"b4e53d15-c42f-4959-9ed5-ddff8241b3d0","Type":"ContainerStarted","Data":"973a5135363e42557d72de87c71619154099bf3d76f7a044e0419fea3c5a53b1"} Dec 09 04:48:50 crc kubenswrapper[4766]: I1209 04:48:50.315095 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7c8md" podStartSLOduration=1.315074844 podStartE2EDuration="1.315074844s" podCreationTimestamp="2025-12-09 04:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:48:50.310251414 
+0000 UTC m=+5812.019556850" watchObservedRunningTime="2025-12-09 04:48:50.315074844 +0000 UTC m=+5812.024380280" Dec 09 04:48:50 crc kubenswrapper[4766]: I1209 04:48:50.811741 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 04:48:50 crc kubenswrapper[4766]: I1209 04:48:50.812005 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 04:48:50 crc kubenswrapper[4766]: I1209 04:48:50.851911 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 04:48:50 crc kubenswrapper[4766]: I1209 04:48:50.851966 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 04:48:51 crc kubenswrapper[4766]: I1209 04:48:51.979382 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:48:51 crc kubenswrapper[4766]: I1209 04:48:51.979382 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:48:51 crc kubenswrapper[4766]: I1209 04:48:51.979408 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:48:51 crc kubenswrapper[4766]: I1209 04:48:51.979431 4766 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:48:53 crc kubenswrapper[4766]: I1209 04:48:53.724197 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 04:48:53 crc kubenswrapper[4766]: I1209 04:48:53.753460 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 04:48:54 crc kubenswrapper[4766]: I1209 04:48:54.451261 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 04:48:55 crc kubenswrapper[4766]: I1209 04:48:55.339228 4766 generic.go:334] "Generic (PLEG): container finished" podID="b4e53d15-c42f-4959-9ed5-ddff8241b3d0" containerID="7a606ef6a26c059612d4536fbcc43763558f4d1f249a3613caa872f83e394901" exitCode=0 Dec 09 04:48:55 crc kubenswrapper[4766]: I1209 04:48:55.339315 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7c8md" event={"ID":"b4e53d15-c42f-4959-9ed5-ddff8241b3d0","Type":"ContainerDied","Data":"7a606ef6a26c059612d4536fbcc43763558f4d1f249a3613caa872f83e394901"} Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.740556 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.903393 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-config-data\") pod \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.903622 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-combined-ca-bundle\") pod \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.903704 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld5pp\" (UniqueName: \"kubernetes.io/projected/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-kube-api-access-ld5pp\") pod \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.903881 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-scripts\") pod \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\" (UID: \"b4e53d15-c42f-4959-9ed5-ddff8241b3d0\") " Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.908570 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-scripts" (OuterVolumeSpecName: "scripts") pod "b4e53d15-c42f-4959-9ed5-ddff8241b3d0" (UID: "b4e53d15-c42f-4959-9ed5-ddff8241b3d0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.911314 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-kube-api-access-ld5pp" (OuterVolumeSpecName: "kube-api-access-ld5pp") pod "b4e53d15-c42f-4959-9ed5-ddff8241b3d0" (UID: "b4e53d15-c42f-4959-9ed5-ddff8241b3d0"). InnerVolumeSpecName "kube-api-access-ld5pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.927914 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4e53d15-c42f-4959-9ed5-ddff8241b3d0" (UID: "b4e53d15-c42f-4959-9ed5-ddff8241b3d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:56 crc kubenswrapper[4766]: I1209 04:48:56.928757 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-config-data" (OuterVolumeSpecName: "config-data") pod "b4e53d15-c42f-4959-9ed5-ddff8241b3d0" (UID: "b4e53d15-c42f-4959-9ed5-ddff8241b3d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.005718 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.005751 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.005761 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.005773 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld5pp\" (UniqueName: \"kubernetes.io/projected/b4e53d15-c42f-4959-9ed5-ddff8241b3d0-kube-api-access-ld5pp\") on node \"crc\" DevicePath \"\"" Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.355898 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7c8md" event={"ID":"b4e53d15-c42f-4959-9ed5-ddff8241b3d0","Type":"ContainerDied","Data":"973a5135363e42557d72de87c71619154099bf3d76f7a044e0419fea3c5a53b1"} Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.355948 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="973a5135363e42557d72de87c71619154099bf3d76f7a044e0419fea3c5a53b1" Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.355918 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7c8md" Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.531112 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.532364 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-api" containerID="cri-o://31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489" gracePeriod=30 Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.532312 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-log" containerID="cri-o://4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c" gracePeriod=30 Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.594335 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.594634 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="83885030-1eda-4c1e-954f-6793d46f62f0" containerName="nova-scheduler-scheduler" containerID="cri-o://da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e" gracePeriod=30 Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.618108 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.618343 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-log" containerID="cri-o://bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2" gracePeriod=30 Dec 09 04:48:57 crc kubenswrapper[4766]: I1209 04:48:57.618724 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-metadata" containerID="cri-o://41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f" gracePeriod=30 Dec 09 04:48:58 crc kubenswrapper[4766]: I1209 04:48:58.368450 4766 generic.go:334] "Generic (PLEG): container finished" podID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerID="bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2" exitCode=143 Dec 09 04:48:58 crc kubenswrapper[4766]: I1209 04:48:58.368530 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9090014c-4a18-45da-8f15-7d3f8ae49bc7","Type":"ContainerDied","Data":"bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2"} Dec 09 04:48:58 crc kubenswrapper[4766]: I1209 04:48:58.370673 4766 generic.go:334] "Generic (PLEG): container finished" podID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerID="4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c" exitCode=143 Dec 09 04:48:58 crc kubenswrapper[4766]: I1209 04:48:58.370710 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0be9aacd-c83f-4fb4-9355-9d9e67816128","Type":"ContainerDied","Data":"4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c"} Dec 09 04:48:58 crc kubenswrapper[4766]: E1209 04:48:58.720092 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 04:48:58 crc kubenswrapper[4766]: E1209 04:48:58.721389 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 04:48:58 crc kubenswrapper[4766]: E1209 04:48:58.722714 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 09 04:48:58 crc kubenswrapper[4766]: E1209 04:48:58.722766 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="83885030-1eda-4c1e-954f-6793d46f62f0" containerName="nova-scheduler-scheduler" Dec 09 04:49:00 crc kubenswrapper[4766]: I1209 04:49:00.928249 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.071724 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.084492 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hhr9\" (UniqueName: \"kubernetes.io/projected/83885030-1eda-4c1e-954f-6793d46f62f0-kube-api-access-4hhr9\") pod \"83885030-1eda-4c1e-954f-6793d46f62f0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.084552 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-combined-ca-bundle\") pod \"83885030-1eda-4c1e-954f-6793d46f62f0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.084617 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-config-data\") pod \"83885030-1eda-4c1e-954f-6793d46f62f0\" (UID: \"83885030-1eda-4c1e-954f-6793d46f62f0\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.094077 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83885030-1eda-4c1e-954f-6793d46f62f0-kube-api-access-4hhr9" (OuterVolumeSpecName: "kube-api-access-4hhr9") pod "83885030-1eda-4c1e-954f-6793d46f62f0" (UID: "83885030-1eda-4c1e-954f-6793d46f62f0"). InnerVolumeSpecName "kube-api-access-4hhr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.115385 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83885030-1eda-4c1e-954f-6793d46f62f0" (UID: "83885030-1eda-4c1e-954f-6793d46f62f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.123400 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-config-data" (OuterVolumeSpecName: "config-data") pod "83885030-1eda-4c1e-954f-6793d46f62f0" (UID: "83885030-1eda-4c1e-954f-6793d46f62f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.160255 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.189703 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9aacd-c83f-4fb4-9355-9d9e67816128-logs\") pod \"0be9aacd-c83f-4fb4-9355-9d9e67816128\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.189805 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48wr5\" (UniqueName: \"kubernetes.io/projected/0be9aacd-c83f-4fb4-9355-9d9e67816128-kube-api-access-48wr5\") pod \"0be9aacd-c83f-4fb4-9355-9d9e67816128\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.189968 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-config-data\") pod \"0be9aacd-c83f-4fb4-9355-9d9e67816128\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.190024 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-combined-ca-bundle\") pod 
\"0be9aacd-c83f-4fb4-9355-9d9e67816128\" (UID: \"0be9aacd-c83f-4fb4-9355-9d9e67816128\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.190989 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.191009 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hhr9\" (UniqueName: \"kubernetes.io/projected/83885030-1eda-4c1e-954f-6793d46f62f0-kube-api-access-4hhr9\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.191026 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83885030-1eda-4c1e-954f-6793d46f62f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.196765 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be9aacd-c83f-4fb4-9355-9d9e67816128-kube-api-access-48wr5" (OuterVolumeSpecName: "kube-api-access-48wr5") pod "0be9aacd-c83f-4fb4-9355-9d9e67816128" (UID: "0be9aacd-c83f-4fb4-9355-9d9e67816128"). InnerVolumeSpecName "kube-api-access-48wr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.197407 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be9aacd-c83f-4fb4-9355-9d9e67816128-logs" (OuterVolumeSpecName: "logs") pod "0be9aacd-c83f-4fb4-9355-9d9e67816128" (UID: "0be9aacd-c83f-4fb4-9355-9d9e67816128"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.220608 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0be9aacd-c83f-4fb4-9355-9d9e67816128" (UID: "0be9aacd-c83f-4fb4-9355-9d9e67816128"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.245936 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-config-data" (OuterVolumeSpecName: "config-data") pod "0be9aacd-c83f-4fb4-9355-9d9e67816128" (UID: "0be9aacd-c83f-4fb4-9355-9d9e67816128"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.292641 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmrv6\" (UniqueName: \"kubernetes.io/projected/9090014c-4a18-45da-8f15-7d3f8ae49bc7-kube-api-access-cmrv6\") pod \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.292852 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-combined-ca-bundle\") pod \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.292942 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-config-data\") pod \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\" (UID: 
\"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.292971 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9090014c-4a18-45da-8f15-7d3f8ae49bc7-logs\") pod \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\" (UID: \"9090014c-4a18-45da-8f15-7d3f8ae49bc7\") " Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.293411 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be9aacd-c83f-4fb4-9355-9d9e67816128-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.293444 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48wr5\" (UniqueName: \"kubernetes.io/projected/0be9aacd-c83f-4fb4-9355-9d9e67816128-kube-api-access-48wr5\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.293459 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.293469 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be9aacd-c83f-4fb4-9355-9d9e67816128-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.293568 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9090014c-4a18-45da-8f15-7d3f8ae49bc7-logs" (OuterVolumeSpecName: "logs") pod "9090014c-4a18-45da-8f15-7d3f8ae49bc7" (UID: "9090014c-4a18-45da-8f15-7d3f8ae49bc7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.295493 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9090014c-4a18-45da-8f15-7d3f8ae49bc7-kube-api-access-cmrv6" (OuterVolumeSpecName: "kube-api-access-cmrv6") pod "9090014c-4a18-45da-8f15-7d3f8ae49bc7" (UID: "9090014c-4a18-45da-8f15-7d3f8ae49bc7"). InnerVolumeSpecName "kube-api-access-cmrv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.314811 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9090014c-4a18-45da-8f15-7d3f8ae49bc7" (UID: "9090014c-4a18-45da-8f15-7d3f8ae49bc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.322445 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-config-data" (OuterVolumeSpecName: "config-data") pod "9090014c-4a18-45da-8f15-7d3f8ae49bc7" (UID: "9090014c-4a18-45da-8f15-7d3f8ae49bc7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.395146 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.395192 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9090014c-4a18-45da-8f15-7d3f8ae49bc7-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.395205 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9090014c-4a18-45da-8f15-7d3f8ae49bc7-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.395277 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmrv6\" (UniqueName: \"kubernetes.io/projected/9090014c-4a18-45da-8f15-7d3f8ae49bc7-kube-api-access-cmrv6\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.401388 4766 generic.go:334] "Generic (PLEG): container finished" podID="83885030-1eda-4c1e-954f-6793d46f62f0" containerID="da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e" exitCode=0 Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.401468 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.401496 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"83885030-1eda-4c1e-954f-6793d46f62f0","Type":"ContainerDied","Data":"da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e"} Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.401565 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"83885030-1eda-4c1e-954f-6793d46f62f0","Type":"ContainerDied","Data":"fd2220d6a104d320ef651e4595368f48fd77d42a426c8fd0e0d57d2b94f2924a"} Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.401583 4766 scope.go:117] "RemoveContainer" containerID="da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.404023 4766 generic.go:334] "Generic (PLEG): container finished" podID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerID="41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f" exitCode=0 Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.404131 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9090014c-4a18-45da-8f15-7d3f8ae49bc7","Type":"ContainerDied","Data":"41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f"} Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.404184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9090014c-4a18-45da-8f15-7d3f8ae49bc7","Type":"ContainerDied","Data":"c8e63bc85dd3fc10c6344378c41784c7c597d0cff99836f404994eaa8a961b20"} Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.404376 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.407912 4766 generic.go:334] "Generic (PLEG): container finished" podID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerID="31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489" exitCode=0 Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.407942 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0be9aacd-c83f-4fb4-9355-9d9e67816128","Type":"ContainerDied","Data":"31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489"} Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.407960 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0be9aacd-c83f-4fb4-9355-9d9e67816128","Type":"ContainerDied","Data":"381321f457543ed6f705e1fb529950aa74ed84798cca5e365e7d1d3bbcec8f7c"} Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.408046 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.431879 4766 scope.go:117] "RemoveContainer" containerID="da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.434987 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e\": container with ID starting with da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e not found: ID does not exist" containerID="da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.435024 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e"} err="failed to get container status \"da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e\": rpc error: code = NotFound desc = could not find container \"da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e\": container with ID starting with da0cff7cff1fdef8dd994ac5091582956db1f5009ddd6ed052b0595f63c3e75e not found: ID does not exist" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.435054 4766 scope.go:117] "RemoveContainer" containerID="41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.439660 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.456454 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.472818 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.473330 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83885030-1eda-4c1e-954f-6793d46f62f0" containerName="nova-scheduler-scheduler" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473353 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="83885030-1eda-4c1e-954f-6793d46f62f0" containerName="nova-scheduler-scheduler" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.473372 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e53d15-c42f-4959-9ed5-ddff8241b3d0" containerName="nova-manage" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473381 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e53d15-c42f-4959-9ed5-ddff8241b3d0" containerName="nova-manage" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.473395 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-metadata" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473403 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-metadata" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.473422 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-api" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473429 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-api" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.473446 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-log" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473454 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-log" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 
04:49:01.473472 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-log" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473479 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-log" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473660 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-log" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473668 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="83885030-1eda-4c1e-954f-6793d46f62f0" containerName="nova-scheduler-scheduler" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473678 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" containerName="nova-metadata-metadata" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473692 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e53d15-c42f-4959-9ed5-ddff8241b3d0" containerName="nova-manage" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473702 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-api" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.473712 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" containerName="nova-api-log" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.474447 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.476411 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.506422 4766 scope.go:117] "RemoveContainer" containerID="bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.509694 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.540538 4766 scope.go:117] "RemoveContainer" containerID="41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.541088 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f\": container with ID starting with 41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f not found: ID does not exist" containerID="41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.541146 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f"} err="failed to get container status \"41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f\": rpc error: code = NotFound desc = could not find container \"41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f\": container with ID starting with 41c3715ede509e79066813572a83dd7c20bfa832eaeef164b2f5ed22b69d669f not found: ID does not exist" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.541182 4766 scope.go:117] "RemoveContainer" containerID="bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2" Dec 09 04:49:01 crc 
kubenswrapper[4766]: E1209 04:49:01.541606 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2\": container with ID starting with bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2 not found: ID does not exist" containerID="bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.541647 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2"} err="failed to get container status \"bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2\": rpc error: code = NotFound desc = could not find container \"bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2\": container with ID starting with bb964c64546f4f0638938f2aa21d0c0030446afa88083671b6a897ea3cfe77b2 not found: ID does not exist" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.541676 4766 scope.go:117] "RemoveContainer" containerID="31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.552871 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.561798 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.570394 4766 scope.go:117] "RemoveContainer" containerID="4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.573651 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.575110 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.584505 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.592186 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.597152 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.598621 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.598691 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dwsl\" (UniqueName: \"kubernetes.io/projected/f8963e45-aabf-4f68-bf91-d541dda121d9-kube-api-access-5dwsl\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.598721 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-config-data\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.603047 4766 scope.go:117] "RemoveContainer" containerID="31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.603520 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489\": container with ID starting with 31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489 not found: ID does not exist" containerID="31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.603556 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489"} err="failed to get container status \"31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489\": rpc error: code = NotFound desc = could not find container \"31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489\": container with ID starting with 31671ba01712200471f6f464996ca8e4df48041ceab8764a620e9d100bdcf489 not found: ID does not exist" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.603583 4766 scope.go:117] "RemoveContainer" containerID="4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c" Dec 09 04:49:01 crc kubenswrapper[4766]: E1209 04:49:01.604081 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c\": container with ID starting with 4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c not found: ID does not exist" containerID="4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.604130 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c"} err="failed to get container status \"4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c\": rpc error: code = NotFound desc = could not find container 
\"4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c\": container with ID starting with 4151e494d69a41ee2c78c4e9516246e35276139a177be3dfcb3804f4abddf67c not found: ID does not exist" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.606804 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.616842 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.618323 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.620533 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.624319 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700089 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700176 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-config-data\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700252 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-config-data\") pod 
\"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700343 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c12d527-b20c-4b42-9106-6398d9f1681b-logs\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700408 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkwfp\" (UniqueName: \"kubernetes.io/projected/3c12d527-b20c-4b42-9106-6398d9f1681b-kube-api-access-rkwfp\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700449 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dwsl\" (UniqueName: \"kubernetes.io/projected/f8963e45-aabf-4f68-bf91-d541dda121d9-kube-api-access-5dwsl\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700648 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700741 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-config-data\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 
04:49:01.700829 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3794f4f9-a807-42c6-8bb0-f00fab8e7993-logs\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.700979 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.701058 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fwr7\" (UniqueName: \"kubernetes.io/projected/3794f4f9-a807-42c6-8bb0-f00fab8e7993-kube-api-access-8fwr7\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.704897 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.713824 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-config-data\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.716317 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dwsl\" (UniqueName: 
\"kubernetes.io/projected/f8963e45-aabf-4f68-bf91-d541dda121d9-kube-api-access-5dwsl\") pod \"nova-scheduler-0\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.798148 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804395 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c12d527-b20c-4b42-9106-6398d9f1681b-logs\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804470 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkwfp\" (UniqueName: \"kubernetes.io/projected/3c12d527-b20c-4b42-9106-6398d9f1681b-kube-api-access-rkwfp\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804512 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804545 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3794f4f9-a807-42c6-8bb0-f00fab8e7993-logs\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804617 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804662 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fwr7\" (UniqueName: \"kubernetes.io/projected/3794f4f9-a807-42c6-8bb0-f00fab8e7993-kube-api-access-8fwr7\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804688 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-config-data\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804705 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c12d527-b20c-4b42-9106-6398d9f1681b-logs\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.804709 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-config-data\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.806076 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3794f4f9-a807-42c6-8bb0-f00fab8e7993-logs\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.808763 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.810469 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-config-data\") pod \"nova-api-0\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.811750 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.826772 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fwr7\" (UniqueName: \"kubernetes.io/projected/3794f4f9-a807-42c6-8bb0-f00fab8e7993-kube-api-access-8fwr7\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.828008 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-config-data\") pod \"nova-metadata-0\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " pod="openstack/nova-metadata-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.834582 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkwfp\" (UniqueName: \"kubernetes.io/projected/3c12d527-b20c-4b42-9106-6398d9f1681b-kube-api-access-rkwfp\") pod \"nova-api-0\" (UID: 
\"3c12d527-b20c-4b42-9106-6398d9f1681b\") " pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.892920 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:49:01 crc kubenswrapper[4766]: I1209 04:49:01.938416 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.244232 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.366101 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.441413 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.445477 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c12d527-b20c-4b42-9106-6398d9f1681b","Type":"ContainerStarted","Data":"4dbb937d53ff41173ad98a79c7b5dc94caad84960ad41eb66c816bceaf17db69"} Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.456136 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8963e45-aabf-4f68-bf91-d541dda121d9","Type":"ContainerStarted","Data":"ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996"} Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.456195 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8963e45-aabf-4f68-bf91-d541dda121d9","Type":"ContainerStarted","Data":"bce06c4d77064a504f5717052cc00b13443d12f5c077b7ff450807fc1a27383c"} Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.476138 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.4761215380000001 
podStartE2EDuration="1.476121538s" podCreationTimestamp="2025-12-09 04:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:49:02.473807585 +0000 UTC m=+5824.183113011" watchObservedRunningTime="2025-12-09 04:49:02.476121538 +0000 UTC m=+5824.185426964" Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.848605 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be9aacd-c83f-4fb4-9355-9d9e67816128" path="/var/lib/kubelet/pods/0be9aacd-c83f-4fb4-9355-9d9e67816128/volumes" Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.849431 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83885030-1eda-4c1e-954f-6793d46f62f0" path="/var/lib/kubelet/pods/83885030-1eda-4c1e-954f-6793d46f62f0/volumes" Dec 09 04:49:02 crc kubenswrapper[4766]: I1209 04:49:02.849958 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9090014c-4a18-45da-8f15-7d3f8ae49bc7" path="/var/lib/kubelet/pods/9090014c-4a18-45da-8f15-7d3f8ae49bc7/volumes" Dec 09 04:49:03 crc kubenswrapper[4766]: I1209 04:49:03.485922 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c12d527-b20c-4b42-9106-6398d9f1681b","Type":"ContainerStarted","Data":"5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270"} Dec 09 04:49:03 crc kubenswrapper[4766]: I1209 04:49:03.485969 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c12d527-b20c-4b42-9106-6398d9f1681b","Type":"ContainerStarted","Data":"fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e"} Dec 09 04:49:03 crc kubenswrapper[4766]: I1209 04:49:03.509079 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3794f4f9-a807-42c6-8bb0-f00fab8e7993","Type":"ContainerStarted","Data":"b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3"} Dec 09 04:49:03 crc kubenswrapper[4766]: I1209 04:49:03.509426 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3794f4f9-a807-42c6-8bb0-f00fab8e7993","Type":"ContainerStarted","Data":"c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c"} Dec 09 04:49:03 crc kubenswrapper[4766]: I1209 04:49:03.509439 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3794f4f9-a807-42c6-8bb0-f00fab8e7993","Type":"ContainerStarted","Data":"02a9f96b0ebe14d9394bb7fd4515181f024082cd110c104e53d2eefbfabe4185"} Dec 09 04:49:03 crc kubenswrapper[4766]: I1209 04:49:03.543399 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.543384342 podStartE2EDuration="2.543384342s" podCreationTimestamp="2025-12-09 04:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:49:03.511642594 +0000 UTC m=+5825.220948040" watchObservedRunningTime="2025-12-09 04:49:03.543384342 +0000 UTC m=+5825.252689768" Dec 09 04:49:03 crc kubenswrapper[4766]: I1209 04:49:03.559470 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.559451806 podStartE2EDuration="2.559451806s" podCreationTimestamp="2025-12-09 04:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:49:03.54072014 +0000 UTC m=+5825.250025566" watchObservedRunningTime="2025-12-09 04:49:03.559451806 +0000 UTC m=+5825.268757222" Dec 09 04:49:04 crc kubenswrapper[4766]: I1209 04:49:04.840475 4766 scope.go:117] "RemoveContainer" 
containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:49:04 crc kubenswrapper[4766]: E1209 04:49:04.840769 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:49:06 crc kubenswrapper[4766]: I1209 04:49:06.798802 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 04:49:06 crc kubenswrapper[4766]: I1209 04:49:06.938455 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 04:49:06 crc kubenswrapper[4766]: I1209 04:49:06.938500 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 04:49:11 crc kubenswrapper[4766]: I1209 04:49:11.798568 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 04:49:11 crc kubenswrapper[4766]: I1209 04:49:11.832715 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 04:49:11 crc kubenswrapper[4766]: I1209 04:49:11.894421 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 04:49:11 crc kubenswrapper[4766]: I1209 04:49:11.894473 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 04:49:11 crc kubenswrapper[4766]: I1209 04:49:11.938731 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 04:49:11 crc kubenswrapper[4766]: I1209 04:49:11.938809 4766 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 04:49:12 crc kubenswrapper[4766]: I1209 04:49:12.650955 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 04:49:12 crc kubenswrapper[4766]: I1209 04:49:12.978402 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:49:12 crc kubenswrapper[4766]: I1209 04:49:12.978458 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:49:13 crc kubenswrapper[4766]: I1209 04:49:13.060456 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:49:13 crc kubenswrapper[4766]: I1209 04:49:13.060501 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:49:14 crc kubenswrapper[4766]: I1209 04:49:14.462145 4766 scope.go:117] "RemoveContainer" containerID="9fa5b938b00031f6434c986a3e0f3db01ac47d9a1e2e840a7ec7f74c073db07c" Dec 09 04:49:15 crc kubenswrapper[4766]: I1209 04:49:15.839394 4766 scope.go:117] "RemoveContainer" 
containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:49:15 crc kubenswrapper[4766]: E1209 04:49:15.839927 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.897524 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.898111 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.898433 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.898456 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.900866 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.902095 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.941145 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.941702 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 04:49:21 crc kubenswrapper[4766]: I1209 04:49:21.942710 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.115806 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bd97897d9-4c5h9"] Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.117330 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.138740 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bd97897d9-4c5h9"] Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.220598 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.220876 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-dns-svc\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.221045 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9rk\" (UniqueName: \"kubernetes.io/projected/20bf1448-5c36-4c0c-8720-d6b6764c3997-kube-api-access-7f9rk\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.221129 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-config\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.221165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.322837 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.322961 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-dns-svc\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.323014 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9rk\" (UniqueName: \"kubernetes.io/projected/20bf1448-5c36-4c0c-8720-d6b6764c3997-kube-api-access-7f9rk\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.323054 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-config\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.323073 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.324620 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-nb\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.324621 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-config\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.324722 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-dns-svc\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.324949 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-sb\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: 
\"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.345480 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9rk\" (UniqueName: \"kubernetes.io/projected/20bf1448-5c36-4c0c-8720-d6b6764c3997-kube-api-access-7f9rk\") pod \"dnsmasq-dns-6bd97897d9-4c5h9\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.438953 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.708547 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 04:49:22 crc kubenswrapper[4766]: I1209 04:49:22.983652 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bd97897d9-4c5h9"] Dec 09 04:49:22 crc kubenswrapper[4766]: W1209 04:49:22.988995 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20bf1448_5c36_4c0c_8720_d6b6764c3997.slice/crio-d8c128aa3241c695ce94bc221ff1aa1d689b1803252622ddeadeb9b7754e12e4 WatchSource:0}: Error finding container d8c128aa3241c695ce94bc221ff1aa1d689b1803252622ddeadeb9b7754e12e4: Status 404 returned error can't find the container with id d8c128aa3241c695ce94bc221ff1aa1d689b1803252622ddeadeb9b7754e12e4 Dec 09 04:49:23 crc kubenswrapper[4766]: I1209 04:49:23.709907 4766 generic.go:334] "Generic (PLEG): container finished" podID="20bf1448-5c36-4c0c-8720-d6b6764c3997" containerID="8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55" exitCode=0 Dec 09 04:49:23 crc kubenswrapper[4766]: I1209 04:49:23.711304 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" 
event={"ID":"20bf1448-5c36-4c0c-8720-d6b6764c3997","Type":"ContainerDied","Data":"8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55"} Dec 09 04:49:23 crc kubenswrapper[4766]: I1209 04:49:23.711353 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" event={"ID":"20bf1448-5c36-4c0c-8720-d6b6764c3997","Type":"ContainerStarted","Data":"d8c128aa3241c695ce94bc221ff1aa1d689b1803252622ddeadeb9b7754e12e4"} Dec 09 04:49:24 crc kubenswrapper[4766]: I1209 04:49:24.720221 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" event={"ID":"20bf1448-5c36-4c0c-8720-d6b6764c3997","Type":"ContainerStarted","Data":"f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e"} Dec 09 04:49:24 crc kubenswrapper[4766]: I1209 04:49:24.747666 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" podStartSLOduration=2.747646877 podStartE2EDuration="2.747646877s" podCreationTimestamp="2025-12-09 04:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:49:24.738928842 +0000 UTC m=+5846.448234288" watchObservedRunningTime="2025-12-09 04:49:24.747646877 +0000 UTC m=+5846.456952303" Dec 09 04:49:25 crc kubenswrapper[4766]: I1209 04:49:25.727857 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:26 crc kubenswrapper[4766]: I1209 04:49:26.839507 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:49:26 crc kubenswrapper[4766]: E1209 04:49:26.840075 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:49:32 crc kubenswrapper[4766]: I1209 04:49:32.440430 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:32 crc kubenswrapper[4766]: I1209 04:49:32.496711 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ff4dfdf57-42jzr"] Dec 09 04:49:32 crc kubenswrapper[4766]: I1209 04:49:32.497551 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" podUID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" containerName="dnsmasq-dns" containerID="cri-o://c1b58114885327f5c5843ea99009e000dc7f0dbc304a1e8df15ab465ab4fa1cb" gracePeriod=10 Dec 09 04:49:32 crc kubenswrapper[4766]: I1209 04:49:32.824561 4766 generic.go:334] "Generic (PLEG): container finished" podID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" containerID="c1b58114885327f5c5843ea99009e000dc7f0dbc304a1e8df15ab465ab4fa1cb" exitCode=0 Dec 09 04:49:32 crc kubenswrapper[4766]: I1209 04:49:32.824886 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" event={"ID":"e2f2552e-146e-4ba2-aa1e-b1a671255e30","Type":"ContainerDied","Data":"c1b58114885327f5c5843ea99009e000dc7f0dbc304a1e8df15ab465ab4fa1cb"} Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.010549 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.176885 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-dns-svc\") pod \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.177041 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-nb\") pod \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.177128 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-sb\") pod \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.177375 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz2vg\" (UniqueName: \"kubernetes.io/projected/e2f2552e-146e-4ba2-aa1e-b1a671255e30-kube-api-access-qz2vg\") pod \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.177460 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-config\") pod \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\" (UID: \"e2f2552e-146e-4ba2-aa1e-b1a671255e30\") " Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.191399 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e2f2552e-146e-4ba2-aa1e-b1a671255e30-kube-api-access-qz2vg" (OuterVolumeSpecName: "kube-api-access-qz2vg") pod "e2f2552e-146e-4ba2-aa1e-b1a671255e30" (UID: "e2f2552e-146e-4ba2-aa1e-b1a671255e30"). InnerVolumeSpecName "kube-api-access-qz2vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.226079 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2f2552e-146e-4ba2-aa1e-b1a671255e30" (UID: "e2f2552e-146e-4ba2-aa1e-b1a671255e30"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.230819 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2f2552e-146e-4ba2-aa1e-b1a671255e30" (UID: "e2f2552e-146e-4ba2-aa1e-b1a671255e30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.235313 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-config" (OuterVolumeSpecName: "config") pod "e2f2552e-146e-4ba2-aa1e-b1a671255e30" (UID: "e2f2552e-146e-4ba2-aa1e-b1a671255e30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.240409 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2f2552e-146e-4ba2-aa1e-b1a671255e30" (UID: "e2f2552e-146e-4ba2-aa1e-b1a671255e30"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.280602 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz2vg\" (UniqueName: \"kubernetes.io/projected/e2f2552e-146e-4ba2-aa1e-b1a671255e30-kube-api-access-qz2vg\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.280641 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.280652 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.280664 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.280678 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2f2552e-146e-4ba2-aa1e-b1a671255e30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.839600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" event={"ID":"e2f2552e-146e-4ba2-aa1e-b1a671255e30","Type":"ContainerDied","Data":"e932d83c23402177df2c21f741ea1942d157cca4bc99441e3f4e6f16ab8f2ff4"} Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.839876 4766 scope.go:117] "RemoveContainer" containerID="c1b58114885327f5c5843ea99009e000dc7f0dbc304a1e8df15ab465ab4fa1cb" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.840072 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ff4dfdf57-42jzr" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.865617 4766 scope.go:117] "RemoveContainer" containerID="25a50d61a5c5e21e089ed4921f38b661893d32c846f68f48a1157866ff620c10" Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.883699 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ff4dfdf57-42jzr"] Dec 09 04:49:33 crc kubenswrapper[4766]: I1209 04:49:33.894144 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ff4dfdf57-42jzr"] Dec 09 04:49:34 crc kubenswrapper[4766]: I1209 04:49:34.850353 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" path="/var/lib/kubelet/pods/e2f2552e-146e-4ba2-aa1e-b1a671255e30/volumes" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.784711 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qxppj"] Dec 09 04:49:35 crc kubenswrapper[4766]: E1209 04:49:35.785489 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" containerName="dnsmasq-dns" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.785513 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" containerName="dnsmasq-dns" Dec 09 04:49:35 crc kubenswrapper[4766]: E1209 04:49:35.785541 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" containerName="init" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.785550 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" containerName="init" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.785789 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f2552e-146e-4ba2-aa1e-b1a671255e30" containerName="dnsmasq-dns" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 
04:49:35.786590 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.794794 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a499-account-create-update-gm95l"] Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.796393 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.798675 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.803997 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qxppj"] Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.813555 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a499-account-create-update-gm95l"] Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.927437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd5pv\" (UniqueName: \"kubernetes.io/projected/fbbece1b-ed34-4636-b968-3adf08d1becb-kube-api-access-pd5pv\") pod \"cinder-db-create-qxppj\" (UID: \"fbbece1b-ed34-4636-b968-3adf08d1becb\") " pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.928548 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kvmc\" (UniqueName: \"kubernetes.io/projected/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-kube-api-access-2kvmc\") pod \"cinder-a499-account-create-update-gm95l\" (UID: \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\") " pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.928587 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-operator-scripts\") pod \"cinder-a499-account-create-update-gm95l\" (UID: \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\") " pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:35 crc kubenswrapper[4766]: I1209 04:49:35.928683 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbece1b-ed34-4636-b968-3adf08d1becb-operator-scripts\") pod \"cinder-db-create-qxppj\" (UID: \"fbbece1b-ed34-4636-b968-3adf08d1becb\") " pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.031037 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kvmc\" (UniqueName: \"kubernetes.io/projected/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-kube-api-access-2kvmc\") pod \"cinder-a499-account-create-update-gm95l\" (UID: \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\") " pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.031096 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-operator-scripts\") pod \"cinder-a499-account-create-update-gm95l\" (UID: \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\") " pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.031167 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbece1b-ed34-4636-b968-3adf08d1becb-operator-scripts\") pod \"cinder-db-create-qxppj\" (UID: \"fbbece1b-ed34-4636-b968-3adf08d1becb\") " pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.031237 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pd5pv\" (UniqueName: \"kubernetes.io/projected/fbbece1b-ed34-4636-b968-3adf08d1becb-kube-api-access-pd5pv\") pod \"cinder-db-create-qxppj\" (UID: \"fbbece1b-ed34-4636-b968-3adf08d1becb\") " pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.032139 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-operator-scripts\") pod \"cinder-a499-account-create-update-gm95l\" (UID: \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\") " pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.032545 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbece1b-ed34-4636-b968-3adf08d1becb-operator-scripts\") pod \"cinder-db-create-qxppj\" (UID: \"fbbece1b-ed34-4636-b968-3adf08d1becb\") " pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.055388 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd5pv\" (UniqueName: \"kubernetes.io/projected/fbbece1b-ed34-4636-b968-3adf08d1becb-kube-api-access-pd5pv\") pod \"cinder-db-create-qxppj\" (UID: \"fbbece1b-ed34-4636-b968-3adf08d1becb\") " pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.056162 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kvmc\" (UniqueName: \"kubernetes.io/projected/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-kube-api-access-2kvmc\") pod \"cinder-a499-account-create-update-gm95l\" (UID: \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\") " pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.108743 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.131413 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.673056 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qxppj"] Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.876623 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a499-account-create-update-gm95l"] Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.884125 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qxppj" event={"ID":"fbbece1b-ed34-4636-b968-3adf08d1becb","Type":"ContainerStarted","Data":"aaed8c6d78317aa2aaad76b76827c34eadecef7ee458f511dd79f83afd19990d"} Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.884165 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qxppj" event={"ID":"fbbece1b-ed34-4636-b968-3adf08d1becb","Type":"ContainerStarted","Data":"f9007224ad95ffb9778a1b40e0c6a8c62a21ca2d1cd7eeb84a16d40508f7e0ab"} Dec 09 04:49:36 crc kubenswrapper[4766]: I1209 04:49:36.909316 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-qxppj" podStartSLOduration=1.9092936379999998 podStartE2EDuration="1.909293638s" podCreationTimestamp="2025-12-09 04:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:49:36.902166815 +0000 UTC m=+5858.611472251" watchObservedRunningTime="2025-12-09 04:49:36.909293638 +0000 UTC m=+5858.618599064" Dec 09 04:49:37 crc kubenswrapper[4766]: I1209 04:49:37.893730 4766 generic.go:334] "Generic (PLEG): container finished" podID="fbbece1b-ed34-4636-b968-3adf08d1becb" 
containerID="aaed8c6d78317aa2aaad76b76827c34eadecef7ee458f511dd79f83afd19990d" exitCode=0 Dec 09 04:49:37 crc kubenswrapper[4766]: I1209 04:49:37.893787 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qxppj" event={"ID":"fbbece1b-ed34-4636-b968-3adf08d1becb","Type":"ContainerDied","Data":"aaed8c6d78317aa2aaad76b76827c34eadecef7ee458f511dd79f83afd19990d"} Dec 09 04:49:37 crc kubenswrapper[4766]: I1209 04:49:37.895867 4766 generic.go:334] "Generic (PLEG): container finished" podID="8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623" containerID="49abd4c8b1442e1413c221ae0f5d36bcb97d7353ebf0e0353c54876f4eb71e1d" exitCode=0 Dec 09 04:49:37 crc kubenswrapper[4766]: I1209 04:49:37.895907 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a499-account-create-update-gm95l" event={"ID":"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623","Type":"ContainerDied","Data":"49abd4c8b1442e1413c221ae0f5d36bcb97d7353ebf0e0353c54876f4eb71e1d"} Dec 09 04:49:37 crc kubenswrapper[4766]: I1209 04:49:37.895936 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a499-account-create-update-gm95l" event={"ID":"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623","Type":"ContainerStarted","Data":"8a4902fc685082943b6e66abeb640180a6d3f13d282336b0ab33a03718fdbcd6"} Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.402101 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.408920 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.601882 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-operator-scripts\") pod \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\" (UID: \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\") " Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.602038 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd5pv\" (UniqueName: \"kubernetes.io/projected/fbbece1b-ed34-4636-b968-3adf08d1becb-kube-api-access-pd5pv\") pod \"fbbece1b-ed34-4636-b968-3adf08d1becb\" (UID: \"fbbece1b-ed34-4636-b968-3adf08d1becb\") " Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.602161 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kvmc\" (UniqueName: \"kubernetes.io/projected/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-kube-api-access-2kvmc\") pod \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\" (UID: \"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623\") " Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.602258 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbece1b-ed34-4636-b968-3adf08d1becb-operator-scripts\") pod \"fbbece1b-ed34-4636-b968-3adf08d1becb\" (UID: \"fbbece1b-ed34-4636-b968-3adf08d1becb\") " Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.602752 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623" (UID: "8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.602794 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbbece1b-ed34-4636-b968-3adf08d1becb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbbece1b-ed34-4636-b968-3adf08d1becb" (UID: "fbbece1b-ed34-4636-b968-3adf08d1becb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.602913 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.602937 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbece1b-ed34-4636-b968-3adf08d1becb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.609496 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-kube-api-access-2kvmc" (OuterVolumeSpecName: "kube-api-access-2kvmc") pod "8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623" (UID: "8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623"). InnerVolumeSpecName "kube-api-access-2kvmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.609587 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbbece1b-ed34-4636-b968-3adf08d1becb-kube-api-access-pd5pv" (OuterVolumeSpecName: "kube-api-access-pd5pv") pod "fbbece1b-ed34-4636-b968-3adf08d1becb" (UID: "fbbece1b-ed34-4636-b968-3adf08d1becb"). InnerVolumeSpecName "kube-api-access-pd5pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.704302 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kvmc\" (UniqueName: \"kubernetes.io/projected/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623-kube-api-access-2kvmc\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.704344 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd5pv\" (UniqueName: \"kubernetes.io/projected/fbbece1b-ed34-4636-b968-3adf08d1becb-kube-api-access-pd5pv\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.918518 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qxppj" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.918510 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qxppj" event={"ID":"fbbece1b-ed34-4636-b968-3adf08d1becb","Type":"ContainerDied","Data":"f9007224ad95ffb9778a1b40e0c6a8c62a21ca2d1cd7eeb84a16d40508f7e0ab"} Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.918671 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9007224ad95ffb9778a1b40e0c6a8c62a21ca2d1cd7eeb84a16d40508f7e0ab" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.920537 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a499-account-create-update-gm95l" event={"ID":"8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623","Type":"ContainerDied","Data":"8a4902fc685082943b6e66abeb640180a6d3f13d282336b0ab33a03718fdbcd6"} Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.920580 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a4902fc685082943b6e66abeb640180a6d3f13d282336b0ab33a03718fdbcd6" Dec 09 04:49:39 crc kubenswrapper[4766]: I1209 04:49:39.920602 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a499-account-create-update-gm95l" Dec 09 04:49:40 crc kubenswrapper[4766]: I1209 04:49:40.839075 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:49:40 crc kubenswrapper[4766]: E1209 04:49:40.839738 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.003017 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xlvns"] Dec 09 04:49:41 crc kubenswrapper[4766]: E1209 04:49:41.003391 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbbece1b-ed34-4636-b968-3adf08d1becb" containerName="mariadb-database-create" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.003407 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbece1b-ed34-4636-b968-3adf08d1becb" containerName="mariadb-database-create" Dec 09 04:49:41 crc kubenswrapper[4766]: E1209 04:49:41.003652 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623" containerName="mariadb-account-create-update" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.003665 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623" containerName="mariadb-account-create-update" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.003951 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbbece1b-ed34-4636-b968-3adf08d1becb" containerName="mariadb-database-create" Dec 09 04:49:41 crc kubenswrapper[4766]: 
I1209 04:49:41.003995 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623" containerName="mariadb-account-create-update" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.004578 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.006507 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xzfjm" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.006551 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.006665 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.025645 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xlvns"] Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.127080 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-combined-ca-bundle\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.127120 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxcnf\" (UniqueName: \"kubernetes.io/projected/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-kube-api-access-gxcnf\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.127189 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-etc-machine-id\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.127312 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-scripts\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.127445 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-db-sync-config-data\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.127498 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-config-data\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.229851 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxcnf\" (UniqueName: \"kubernetes.io/projected/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-kube-api-access-gxcnf\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.229974 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-etc-machine-id\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.230026 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-scripts\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.230069 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-db-sync-config-data\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.230097 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-config-data\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.230124 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-etc-machine-id\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.230175 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-combined-ca-bundle\") pod \"cinder-db-sync-xlvns\" (UID: 
\"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.235571 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-db-sync-config-data\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.235574 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-combined-ca-bundle\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.235723 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-scripts\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.235774 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-config-data\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.245187 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxcnf\" (UniqueName: \"kubernetes.io/projected/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-kube-api-access-gxcnf\") pod \"cinder-db-sync-xlvns\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.326748 4766 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.847625 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xlvns"] Dec 09 04:49:41 crc kubenswrapper[4766]: I1209 04:49:41.946532 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xlvns" event={"ID":"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129","Type":"ContainerStarted","Data":"b777ef19f7ff77e896c3781ef70c5762f5f88823bbb20b81234b9b6e5055fa8d"} Dec 09 04:49:42 crc kubenswrapper[4766]: I1209 04:49:42.957514 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xlvns" event={"ID":"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129","Type":"ContainerStarted","Data":"d5ba375f9aba665c430573120c223f61cd6a72535d41e263a56733cd92f27570"} Dec 09 04:49:42 crc kubenswrapper[4766]: I1209 04:49:42.992644 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xlvns" podStartSLOduration=2.99261905 podStartE2EDuration="2.99261905s" podCreationTimestamp="2025-12-09 04:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:49:42.987619355 +0000 UTC m=+5864.696924791" watchObservedRunningTime="2025-12-09 04:49:42.99261905 +0000 UTC m=+5864.701924476" Dec 09 04:49:44 crc kubenswrapper[4766]: I1209 04:49:44.977684 4766 generic.go:334] "Generic (PLEG): container finished" podID="e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" containerID="d5ba375f9aba665c430573120c223f61cd6a72535d41e263a56733cd92f27570" exitCode=0 Dec 09 04:49:44 crc kubenswrapper[4766]: I1209 04:49:44.977761 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xlvns" event={"ID":"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129","Type":"ContainerDied","Data":"d5ba375f9aba665c430573120c223f61cd6a72535d41e263a56733cd92f27570"} Dec 09 04:49:46 crc 
kubenswrapper[4766]: I1209 04:49:46.301183 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.425937 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-scripts\") pod \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.426008 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-etc-machine-id\") pod \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.426045 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-db-sync-config-data\") pod \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.426151 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-config-data\") pod \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.426236 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-combined-ca-bundle\") pod \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.426253 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxcnf\" (UniqueName: \"kubernetes.io/projected/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-kube-api-access-gxcnf\") pod \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\" (UID: \"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129\") " Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.427095 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" (UID: "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.431518 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-scripts" (OuterVolumeSpecName: "scripts") pod "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" (UID: "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.431721 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-kube-api-access-gxcnf" (OuterVolumeSpecName: "kube-api-access-gxcnf") pod "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" (UID: "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129"). InnerVolumeSpecName "kube-api-access-gxcnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.432412 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" (UID: "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.455777 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" (UID: "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.483657 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-config-data" (OuterVolumeSpecName: "config-data") pod "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" (UID: "e6e4ae20-163f-4d0a-a59a-4c4e79ec7129"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.528086 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.528119 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.528129 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.528138 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-config-data\") on node 
\"crc\" DevicePath \"\"" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.528147 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:46 crc kubenswrapper[4766]: I1209 04:49:46.528156 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxcnf\" (UniqueName: \"kubernetes.io/projected/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129-kube-api-access-gxcnf\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.001035 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xlvns" event={"ID":"e6e4ae20-163f-4d0a-a59a-4c4e79ec7129","Type":"ContainerDied","Data":"b777ef19f7ff77e896c3781ef70c5762f5f88823bbb20b81234b9b6e5055fa8d"} Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.001437 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b777ef19f7ff77e896c3781ef70c5762f5f88823bbb20b81234b9b6e5055fa8d" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.001159 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xlvns" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.632269 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57b67fc789-stj2f"] Dec 09 04:49:47 crc kubenswrapper[4766]: E1209 04:49:47.632668 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" containerName="cinder-db-sync" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.632680 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" containerName="cinder-db-sync" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.632852 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" containerName="cinder-db-sync" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.633797 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.641879 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b67fc789-stj2f"] Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.751531 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-nb\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.751592 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-config\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc 
kubenswrapper[4766]: I1209 04:49:47.751633 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-dns-svc\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.751653 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-sb\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.751675 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blslr\" (UniqueName: \"kubernetes.io/projected/c7dcac32-7e95-4025-8aee-a985b13b52ef-kube-api-access-blslr\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.852861 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-nb\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.852924 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-config\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc 
kubenswrapper[4766]: I1209 04:49:47.852964 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-dns-svc\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.853002 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-sb\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.853024 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blslr\" (UniqueName: \"kubernetes.io/projected/c7dcac32-7e95-4025-8aee-a985b13b52ef-kube-api-access-blslr\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.854441 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-nb\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.854965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-config\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.855491 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-dns-svc\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.857011 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-sb\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.877633 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.879178 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.879906 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blslr\" (UniqueName: \"kubernetes.io/projected/c7dcac32-7e95-4025-8aee-a985b13b52ef-kube-api-access-blslr\") pod \"dnsmasq-dns-57b67fc789-stj2f\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.882558 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.882705 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.882902 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.883089 4766 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-cinder-dockercfg-xzfjm" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.960919 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:47 crc kubenswrapper[4766]: I1209 04:49:47.973359 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.058294 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.058347 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b1fac-7acc-411c-8f77-e80e15ca27ec-logs\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.058374 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.058410 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a15b1fac-7acc-411c-8f77-e80e15ca27ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.058454 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-scripts\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.058535 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.058587 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crxjp\" (UniqueName: \"kubernetes.io/projected/a15b1fac-7acc-411c-8f77-e80e15ca27ec-kube-api-access-crxjp\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.160357 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.160410 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b1fac-7acc-411c-8f77-e80e15ca27ec-logs\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.160432 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.160460 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a15b1fac-7acc-411c-8f77-e80e15ca27ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.160485 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-scripts\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.160568 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.160624 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crxjp\" (UniqueName: \"kubernetes.io/projected/a15b1fac-7acc-411c-8f77-e80e15ca27ec-kube-api-access-crxjp\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.161907 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a15b1fac-7acc-411c-8f77-e80e15ca27ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.162571 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b1fac-7acc-411c-8f77-e80e15ca27ec-logs\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.165639 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-scripts\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.166756 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.173129 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.180310 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.182309 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crxjp\" (UniqueName: \"kubernetes.io/projected/a15b1fac-7acc-411c-8f77-e80e15ca27ec-kube-api-access-crxjp\") pod \"cinder-api-0\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.239938 
4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.481087 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b67fc789-stj2f"] Dec 09 04:49:48 crc kubenswrapper[4766]: W1209 04:49:48.481529 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7dcac32_7e95_4025_8aee_a985b13b52ef.slice/crio-adc475f353efcb811991a6406bbc06d8afee4b906b3ccc081ff70af4082b3268 WatchSource:0}: Error finding container adc475f353efcb811991a6406bbc06d8afee4b906b3ccc081ff70af4082b3268: Status 404 returned error can't find the container with id adc475f353efcb811991a6406bbc06d8afee4b906b3ccc081ff70af4082b3268 Dec 09 04:49:48 crc kubenswrapper[4766]: I1209 04:49:48.701551 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:49:49 crc kubenswrapper[4766]: I1209 04:49:49.035641 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a15b1fac-7acc-411c-8f77-e80e15ca27ec","Type":"ContainerStarted","Data":"6fcd708c5f6050709d7e78251ae8fd329d9f3e08332926e7916d779fc3fa294f"} Dec 09 04:49:49 crc kubenswrapper[4766]: I1209 04:49:49.038674 4766 generic.go:334] "Generic (PLEG): container finished" podID="c7dcac32-7e95-4025-8aee-a985b13b52ef" containerID="dfa46574415340b45d64ad78c3ff1899bbe4efa2724142cac8469bc7840fdbdb" exitCode=0 Dec 09 04:49:49 crc kubenswrapper[4766]: I1209 04:49:49.038705 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" event={"ID":"c7dcac32-7e95-4025-8aee-a985b13b52ef","Type":"ContainerDied","Data":"dfa46574415340b45d64ad78c3ff1899bbe4efa2724142cac8469bc7840fdbdb"} Dec 09 04:49:49 crc kubenswrapper[4766]: I1209 04:49:49.038733 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" 
event={"ID":"c7dcac32-7e95-4025-8aee-a985b13b52ef","Type":"ContainerStarted","Data":"adc475f353efcb811991a6406bbc06d8afee4b906b3ccc081ff70af4082b3268"} Dec 09 04:49:50 crc kubenswrapper[4766]: I1209 04:49:50.049204 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a15b1fac-7acc-411c-8f77-e80e15ca27ec","Type":"ContainerStarted","Data":"0635a3e8513c85c5c7c986500448a30ac62cbbbb8f1c8a089c82b85256423e18"} Dec 09 04:49:50 crc kubenswrapper[4766]: I1209 04:49:50.049561 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a15b1fac-7acc-411c-8f77-e80e15ca27ec","Type":"ContainerStarted","Data":"93caa0dbdd1b93e2049b64a62e671d41b6260dc67472fbba16bbc7ed01b0a01c"} Dec 09 04:49:50 crc kubenswrapper[4766]: I1209 04:49:50.051024 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 04:49:50 crc kubenswrapper[4766]: I1209 04:49:50.052646 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" event={"ID":"c7dcac32-7e95-4025-8aee-a985b13b52ef","Type":"ContainerStarted","Data":"6fc6b3ef12267c15d96698b8b2c4282895962bbd8a89ebd7e75b696ada87f8a8"} Dec 09 04:49:50 crc kubenswrapper[4766]: I1209 04:49:50.053183 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:50 crc kubenswrapper[4766]: I1209 04:49:50.070628 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.070602764 podStartE2EDuration="3.070602764s" podCreationTimestamp="2025-12-09 04:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:49:50.067329815 +0000 UTC m=+5871.776635251" watchObservedRunningTime="2025-12-09 04:49:50.070602764 +0000 UTC m=+5871.779908200" Dec 09 04:49:50 crc kubenswrapper[4766]: 
I1209 04:49:50.097742 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" podStartSLOduration=3.097722036 podStartE2EDuration="3.097722036s" podCreationTimestamp="2025-12-09 04:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:49:50.089913036 +0000 UTC m=+5871.799218482" watchObservedRunningTime="2025-12-09 04:49:50.097722036 +0000 UTC m=+5871.807027462" Dec 09 04:49:54 crc kubenswrapper[4766]: I1209 04:49:54.842440 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:49:54 crc kubenswrapper[4766]: E1209 04:49:54.843371 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:49:57 crc kubenswrapper[4766]: I1209 04:49:57.963613 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.048637 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bd97897d9-4c5h9"] Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.048904 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" podUID="20bf1448-5c36-4c0c-8720-d6b6764c3997" containerName="dnsmasq-dns" containerID="cri-o://f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e" gracePeriod=10 Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.550835 4766 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.676304 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-nb\") pod \"20bf1448-5c36-4c0c-8720-d6b6764c3997\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.676395 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-dns-svc\") pod \"20bf1448-5c36-4c0c-8720-d6b6764c3997\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.676429 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-sb\") pod \"20bf1448-5c36-4c0c-8720-d6b6764c3997\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.676531 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-config\") pod \"20bf1448-5c36-4c0c-8720-d6b6764c3997\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.676615 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9rk\" (UniqueName: \"kubernetes.io/projected/20bf1448-5c36-4c0c-8720-d6b6764c3997-kube-api-access-7f9rk\") pod \"20bf1448-5c36-4c0c-8720-d6b6764c3997\" (UID: \"20bf1448-5c36-4c0c-8720-d6b6764c3997\") " Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.683460 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/20bf1448-5c36-4c0c-8720-d6b6764c3997-kube-api-access-7f9rk" (OuterVolumeSpecName: "kube-api-access-7f9rk") pod "20bf1448-5c36-4c0c-8720-d6b6764c3997" (UID: "20bf1448-5c36-4c0c-8720-d6b6764c3997"). InnerVolumeSpecName "kube-api-access-7f9rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.736315 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "20bf1448-5c36-4c0c-8720-d6b6764c3997" (UID: "20bf1448-5c36-4c0c-8720-d6b6764c3997"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.751002 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "20bf1448-5c36-4c0c-8720-d6b6764c3997" (UID: "20bf1448-5c36-4c0c-8720-d6b6764c3997"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.758698 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "20bf1448-5c36-4c0c-8720-d6b6764c3997" (UID: "20bf1448-5c36-4c0c-8720-d6b6764c3997"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.766115 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-config" (OuterVolumeSpecName: "config") pod "20bf1448-5c36-4c0c-8720-d6b6764c3997" (UID: "20bf1448-5c36-4c0c-8720-d6b6764c3997"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.779367 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.779397 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.779426 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.779434 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20bf1448-5c36-4c0c-8720-d6b6764c3997-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:58 crc kubenswrapper[4766]: I1209 04:49:58.779443 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f9rk\" (UniqueName: \"kubernetes.io/projected/20bf1448-5c36-4c0c-8720-d6b6764c3997-kube-api-access-7f9rk\") on node \"crc\" DevicePath \"\"" Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.155548 4766 generic.go:334] "Generic (PLEG): container finished" podID="20bf1448-5c36-4c0c-8720-d6b6764c3997" containerID="f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e" exitCode=0 Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.155902 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" event={"ID":"20bf1448-5c36-4c0c-8720-d6b6764c3997","Type":"ContainerDied","Data":"f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e"} Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 
04:49:59.155956 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" event={"ID":"20bf1448-5c36-4c0c-8720-d6b6764c3997","Type":"ContainerDied","Data":"d8c128aa3241c695ce94bc221ff1aa1d689b1803252622ddeadeb9b7754e12e4"} Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.155977 4766 scope.go:117] "RemoveContainer" containerID="f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e" Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.156141 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bd97897d9-4c5h9" Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.213311 4766 scope.go:117] "RemoveContainer" containerID="8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55" Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.219257 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bd97897d9-4c5h9"] Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.229004 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bd97897d9-4c5h9"] Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.244009 4766 scope.go:117] "RemoveContainer" containerID="f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e" Dec 09 04:49:59 crc kubenswrapper[4766]: E1209 04:49:59.244449 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e\": container with ID starting with f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e not found: ID does not exist" containerID="f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e" Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.244486 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e"} err="failed to get container status \"f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e\": rpc error: code = NotFound desc = could not find container \"f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e\": container with ID starting with f0a3ee361ff8b6815dbfd52a91c045965c02d2941901f529b7593feaeb15b08e not found: ID does not exist" Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.244511 4766 scope.go:117] "RemoveContainer" containerID="8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55" Dec 09 04:49:59 crc kubenswrapper[4766]: E1209 04:49:59.244746 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55\": container with ID starting with 8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55 not found: ID does not exist" containerID="8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55" Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.244774 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55"} err="failed to get container status \"8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55\": rpc error: code = NotFound desc = could not find container \"8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55\": container with ID starting with 8217246fadafb0a7e3a1004d7f69f0b8f153dfe434bdc34d0f890031fa374a55 not found: ID does not exist" Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.483283 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.483551 4766 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-log" containerID="cri-o://c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c" gracePeriod=30 Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.484080 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-metadata" containerID="cri-o://b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3" gracePeriod=30 Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.495410 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.495668 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="463cd510-5202-403e-8f31-0e0fe1d54659" containerName="nova-cell0-conductor-conductor" containerID="cri-o://2803d5877db7f73f5e6c9a48d168bb45b5e63292de040cd6a10870bbc31820a5" gracePeriod=30 Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.511272 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.511564 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1b0eac65-3749-47de-baed-8af5751821d4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://93b4178a41e8f4dd192b5955a9da98ea87a6c4a0ca5c33848b9c597a040e0803" gracePeriod=30 Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.534602 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.534856 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" 
containerName="nova-api-log" containerID="cri-o://fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e" gracePeriod=30 Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.534974 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-api" containerID="cri-o://5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270" gracePeriod=30 Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.545747 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:49:59 crc kubenswrapper[4766]: I1209 04:49:59.545938 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f8963e45-aabf-4f68-bf91-d541dda121d9" containerName="nova-scheduler-scheduler" containerID="cri-o://ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996" gracePeriod=30 Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.041925 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="1b0eac65-3749-47de-baed-8af5751821d4" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.58:6080/vnc_lite.html\": dial tcp 10.217.1.58:6080: connect: connection refused" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.185690 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b0eac65-3749-47de-baed-8af5751821d4" containerID="93b4178a41e8f4dd192b5955a9da98ea87a6c4a0ca5c33848b9c597a040e0803" exitCode=0 Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.185779 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b0eac65-3749-47de-baed-8af5751821d4","Type":"ContainerDied","Data":"93b4178a41e8f4dd192b5955a9da98ea87a6c4a0ca5c33848b9c597a040e0803"} Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.191508 
4766 generic.go:334] "Generic (PLEG): container finished" podID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerID="fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e" exitCode=143 Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.191568 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c12d527-b20c-4b42-9106-6398d9f1681b","Type":"ContainerDied","Data":"fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e"} Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.195063 4766 generic.go:334] "Generic (PLEG): container finished" podID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerID="c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c" exitCode=143 Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.195159 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3794f4f9-a807-42c6-8bb0-f00fab8e7993","Type":"ContainerDied","Data":"c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c"} Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.213263 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.307840 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.409922 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fdwb\" (UniqueName: \"kubernetes.io/projected/1b0eac65-3749-47de-baed-8af5751821d4-kube-api-access-4fdwb\") pod \"1b0eac65-3749-47de-baed-8af5751821d4\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.410008 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-combined-ca-bundle\") pod \"1b0eac65-3749-47de-baed-8af5751821d4\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.410109 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-config-data\") pod \"1b0eac65-3749-47de-baed-8af5751821d4\" (UID: \"1b0eac65-3749-47de-baed-8af5751821d4\") " Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.423472 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0eac65-3749-47de-baed-8af5751821d4-kube-api-access-4fdwb" (OuterVolumeSpecName: "kube-api-access-4fdwb") pod "1b0eac65-3749-47de-baed-8af5751821d4" (UID: "1b0eac65-3749-47de-baed-8af5751821d4"). InnerVolumeSpecName "kube-api-access-4fdwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.443115 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-config-data" (OuterVolumeSpecName: "config-data") pod "1b0eac65-3749-47de-baed-8af5751821d4" (UID: "1b0eac65-3749-47de-baed-8af5751821d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.461919 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b0eac65-3749-47de-baed-8af5751821d4" (UID: "1b0eac65-3749-47de-baed-8af5751821d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.513337 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fdwb\" (UniqueName: \"kubernetes.io/projected/1b0eac65-3749-47de-baed-8af5751821d4-kube-api-access-4fdwb\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.513375 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.513388 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b0eac65-3749-47de-baed-8af5751821d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.759362 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.817581 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dwsl\" (UniqueName: \"kubernetes.io/projected/f8963e45-aabf-4f68-bf91-d541dda121d9-kube-api-access-5dwsl\") pod \"f8963e45-aabf-4f68-bf91-d541dda121d9\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.817718 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-combined-ca-bundle\") pod \"f8963e45-aabf-4f68-bf91-d541dda121d9\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.818310 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-config-data\") pod \"f8963e45-aabf-4f68-bf91-d541dda121d9\" (UID: \"f8963e45-aabf-4f68-bf91-d541dda121d9\") " Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.821118 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8963e45-aabf-4f68-bf91-d541dda121d9-kube-api-access-5dwsl" (OuterVolumeSpecName: "kube-api-access-5dwsl") pod "f8963e45-aabf-4f68-bf91-d541dda121d9" (UID: "f8963e45-aabf-4f68-bf91-d541dda121d9"). InnerVolumeSpecName "kube-api-access-5dwsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.842609 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-config-data" (OuterVolumeSpecName: "config-data") pod "f8963e45-aabf-4f68-bf91-d541dda121d9" (UID: "f8963e45-aabf-4f68-bf91-d541dda121d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.849022 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20bf1448-5c36-4c0c-8720-d6b6764c3997" path="/var/lib/kubelet/pods/20bf1448-5c36-4c0c-8720-d6b6764c3997/volumes" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.853995 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8963e45-aabf-4f68-bf91-d541dda121d9" (UID: "f8963e45-aabf-4f68-bf91-d541dda121d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.920724 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.920784 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dwsl\" (UniqueName: \"kubernetes.io/projected/f8963e45-aabf-4f68-bf91-d541dda121d9-kube-api-access-5dwsl\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:00 crc kubenswrapper[4766]: I1209 04:50:00.920798 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8963e45-aabf-4f68-bf91-d541dda121d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.216395 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.216415 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b0eac65-3749-47de-baed-8af5751821d4","Type":"ContainerDied","Data":"ce9287e308388089af896bfa36d09bca38ee22afaf5fe69f1192452031c48209"} Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.216491 4766 scope.go:117] "RemoveContainer" containerID="93b4178a41e8f4dd192b5955a9da98ea87a6c4a0ca5c33848b9c597a040e0803" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.218758 4766 generic.go:334] "Generic (PLEG): container finished" podID="f8963e45-aabf-4f68-bf91-d541dda121d9" containerID="ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996" exitCode=0 Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.218799 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8963e45-aabf-4f68-bf91-d541dda121d9","Type":"ContainerDied","Data":"ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996"} Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.218821 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.218827 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8963e45-aabf-4f68-bf91-d541dda121d9","Type":"ContainerDied","Data":"bce06c4d77064a504f5717052cc00b13443d12f5c077b7ff450807fc1a27383c"} Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.284693 4766 scope.go:117] "RemoveContainer" containerID="ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.314514 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.319585 4766 scope.go:117] "RemoveContainer" containerID="ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.335278 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.342977 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.351084 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.355001 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:50:01 crc kubenswrapper[4766]: E1209 04:50:01.355464 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20bf1448-5c36-4c0c-8720-d6b6764c3997" containerName="dnsmasq-dns" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.355480 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="20bf1448-5c36-4c0c-8720-d6b6764c3997" containerName="dnsmasq-dns" Dec 09 04:50:01 crc kubenswrapper[4766]: E1209 04:50:01.355494 4766 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1b0eac65-3749-47de-baed-8af5751821d4" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.355501 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0eac65-3749-47de-baed-8af5751821d4" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 04:50:01 crc kubenswrapper[4766]: E1209 04:50:01.355533 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20bf1448-5c36-4c0c-8720-d6b6764c3997" containerName="init" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.355540 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="20bf1448-5c36-4c0c-8720-d6b6764c3997" containerName="init" Dec 09 04:50:01 crc kubenswrapper[4766]: E1209 04:50:01.355557 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8963e45-aabf-4f68-bf91-d541dda121d9" containerName="nova-scheduler-scheduler" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.355564 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8963e45-aabf-4f68-bf91-d541dda121d9" containerName="nova-scheduler-scheduler" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.355790 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0eac65-3749-47de-baed-8af5751821d4" containerName="nova-cell1-novncproxy-novncproxy" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.355820 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="20bf1448-5c36-4c0c-8720-d6b6764c3997" containerName="dnsmasq-dns" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.355838 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8963e45-aabf-4f68-bf91-d541dda121d9" containerName="nova-scheduler-scheduler" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.356592 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.361185 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:50:01 crc kubenswrapper[4766]: E1209 04:50:01.367881 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996\": container with ID starting with ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996 not found: ID does not exist" containerID="ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.367940 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996"} err="failed to get container status \"ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996\": rpc error: code = NotFound desc = could not find container \"ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996\": container with ID starting with ec378436ecc4097ff2581c9367fb0cbac18438c4379ad6f635537f76bbd13996 not found: ID does not exist" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.368334 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.369469 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.371088 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.372290 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.400284 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.445939 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-config-data\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.446030 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.446071 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a97579-3281-4578-9f14-3429d802e69c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.446129 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nl4x\" (UniqueName: \"kubernetes.io/projected/c33cb8e9-0109-4f61-9441-c3a99463eaf4-kube-api-access-7nl4x\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.446268 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a97579-3281-4578-9f14-3429d802e69c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.446308 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj8mt\" (UniqueName: \"kubernetes.io/projected/57a97579-3281-4578-9f14-3429d802e69c-kube-api-access-gj8mt\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.547695 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a97579-3281-4578-9f14-3429d802e69c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.547753 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nl4x\" (UniqueName: \"kubernetes.io/projected/c33cb8e9-0109-4f61-9441-c3a99463eaf4-kube-api-access-7nl4x\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.547812 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a97579-3281-4578-9f14-3429d802e69c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.547858 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gj8mt\" (UniqueName: \"kubernetes.io/projected/57a97579-3281-4578-9f14-3429d802e69c-kube-api-access-gj8mt\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.547885 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-config-data\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.547922 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.553497 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a97579-3281-4578-9f14-3429d802e69c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.554622 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a97579-3281-4578-9f14-3429d802e69c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.569853 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.572508 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-config-data\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.574996 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj8mt\" (UniqueName: \"kubernetes.io/projected/57a97579-3281-4578-9f14-3429d802e69c-kube-api-access-gj8mt\") pod \"nova-cell1-novncproxy-0\" (UID: \"57a97579-3281-4578-9f14-3429d802e69c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.582267 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nl4x\" (UniqueName: \"kubernetes.io/projected/c33cb8e9-0109-4f61-9441-c3a99463eaf4-kube-api-access-7nl4x\") pod \"nova-scheduler-0\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " pod="openstack/nova-scheduler-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.691165 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:01 crc kubenswrapper[4766]: I1209 04:50:01.701566 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.181169 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 04:50:02 crc kubenswrapper[4766]: W1209 04:50:02.190031 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc33cb8e9_0109_4f61_9441_c3a99463eaf4.slice/crio-72128b0d092c24b31e8ba51926bcdf716aeb680a12b7f9df66dfba98aca7553f WatchSource:0}: Error finding container 72128b0d092c24b31e8ba51926bcdf716aeb680a12b7f9df66dfba98aca7553f: Status 404 returned error can't find the container with id 72128b0d092c24b31e8ba51926bcdf716aeb680a12b7f9df66dfba98aca7553f Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.233529 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c33cb8e9-0109-4f61-9441-c3a99463eaf4","Type":"ContainerStarted","Data":"72128b0d092c24b31e8ba51926bcdf716aeb680a12b7f9df66dfba98aca7553f"} Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.259270 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 09 04:50:02 crc kubenswrapper[4766]: W1209 04:50:02.261636 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a97579_3281_4578_9f14_3429d802e69c.slice/crio-8d31590545b44ccd5610b207417eac5911838454dbc9d879adc057da8802a46f WatchSource:0}: Error finding container 8d31590545b44ccd5610b207417eac5911838454dbc9d879adc057da8802a46f: Status 404 returned error can't find the container with id 8d31590545b44ccd5610b207417eac5911838454dbc9d879adc057da8802a46f Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.653660 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"http://10.217.1.72:8775/\": read tcp 10.217.0.2:43072->10.217.1.72:8775: read: connection reset by peer" Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.653744 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": read tcp 10.217.0.2:43074->10.217.1.72:8775: read: connection reset by peer" Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.694117 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": read tcp 10.217.0.2:42730->10.217.1.71:8774: read: connection reset by peer" Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.694127 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": read tcp 10.217.0.2:42736->10.217.1.71:8774: read: connection reset by peer" Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.767922 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.768138 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="60a1d753-d329-4f92-a520-aac278c6acd9" containerName="nova-cell1-conductor-conductor" containerID="cri-o://0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff" gracePeriod=30 Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.860047 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0eac65-3749-47de-baed-8af5751821d4" 
path="/var/lib/kubelet/pods/1b0eac65-3749-47de-baed-8af5751821d4/volumes" Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.860726 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8963e45-aabf-4f68-bf91-d541dda121d9" path="/var/lib/kubelet/pods/f8963e45-aabf-4f68-bf91-d541dda121d9/volumes" Dec 09 04:50:02 crc kubenswrapper[4766]: I1209 04:50:02.997927 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.073920 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-combined-ca-bundle\") pod \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.073994 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-config-data\") pod \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.074178 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3794f4f9-a807-42c6-8bb0-f00fab8e7993-logs\") pod \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.074200 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fwr7\" (UniqueName: \"kubernetes.io/projected/3794f4f9-a807-42c6-8bb0-f00fab8e7993-kube-api-access-8fwr7\") pod \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\" (UID: \"3794f4f9-a807-42c6-8bb0-f00fab8e7993\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.079042 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3794f4f9-a807-42c6-8bb0-f00fab8e7993-logs" (OuterVolumeSpecName: "logs") pod "3794f4f9-a807-42c6-8bb0-f00fab8e7993" (UID: "3794f4f9-a807-42c6-8bb0-f00fab8e7993"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.081676 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3794f4f9-a807-42c6-8bb0-f00fab8e7993-kube-api-access-8fwr7" (OuterVolumeSpecName: "kube-api-access-8fwr7") pod "3794f4f9-a807-42c6-8bb0-f00fab8e7993" (UID: "3794f4f9-a807-42c6-8bb0-f00fab8e7993"). InnerVolumeSpecName "kube-api-access-8fwr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.103401 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-config-data" (OuterVolumeSpecName: "config-data") pod "3794f4f9-a807-42c6-8bb0-f00fab8e7993" (UID: "3794f4f9-a807-42c6-8bb0-f00fab8e7993"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.118122 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3794f4f9-a807-42c6-8bb0-f00fab8e7993" (UID: "3794f4f9-a807-42c6-8bb0-f00fab8e7993"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.169191 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.176089 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3794f4f9-a807-42c6-8bb0-f00fab8e7993-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.176119 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fwr7\" (UniqueName: \"kubernetes.io/projected/3794f4f9-a807-42c6-8bb0-f00fab8e7993-kube-api-access-8fwr7\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.176130 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.176142 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794f4f9-a807-42c6-8bb0-f00fab8e7993-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.254332 4766 generic.go:334] "Generic (PLEG): container finished" podID="463cd510-5202-403e-8f31-0e0fe1d54659" containerID="2803d5877db7f73f5e6c9a48d168bb45b5e63292de040cd6a10870bbc31820a5" exitCode=0 Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.254393 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"463cd510-5202-403e-8f31-0e0fe1d54659","Type":"ContainerDied","Data":"2803d5877db7f73f5e6c9a48d168bb45b5e63292de040cd6a10870bbc31820a5"} Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.257193 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"57a97579-3281-4578-9f14-3429d802e69c","Type":"ContainerStarted","Data":"fbd48d3a74558ca1606596c78f30ad28f98d0032567df90785fc0d39b136299b"} Dec 09 
04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.257253 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"57a97579-3281-4578-9f14-3429d802e69c","Type":"ContainerStarted","Data":"8d31590545b44ccd5610b207417eac5911838454dbc9d879adc057da8802a46f"} Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.260048 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c33cb8e9-0109-4f61-9441-c3a99463eaf4","Type":"ContainerStarted","Data":"7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf"} Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.265007 4766 generic.go:334] "Generic (PLEG): container finished" podID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerID="b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3" exitCode=0 Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.265160 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3794f4f9-a807-42c6-8bb0-f00fab8e7993","Type":"ContainerDied","Data":"b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3"} Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.265184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3794f4f9-a807-42c6-8bb0-f00fab8e7993","Type":"ContainerDied","Data":"02a9f96b0ebe14d9394bb7fd4515181f024082cd110c104e53d2eefbfabe4185"} Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.265232 4766 scope.go:117] "RemoveContainer" containerID="b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.265605 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.276979 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-combined-ca-bundle\") pod \"3c12d527-b20c-4b42-9106-6398d9f1681b\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.277072 4766 generic.go:334] "Generic (PLEG): container finished" podID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerID="5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270" exitCode=0 Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.277101 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c12d527-b20c-4b42-9106-6398d9f1681b-logs\") pod \"3c12d527-b20c-4b42-9106-6398d9f1681b\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.277125 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c12d527-b20c-4b42-9106-6398d9f1681b","Type":"ContainerDied","Data":"5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270"} Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.277156 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c12d527-b20c-4b42-9106-6398d9f1681b","Type":"ContainerDied","Data":"4dbb937d53ff41173ad98a79c7b5dc94caad84960ad41eb66c816bceaf17db69"} Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.277195 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-config-data\") pod \"3c12d527-b20c-4b42-9106-6398d9f1681b\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 
04:50:03.277248 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.277288 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkwfp\" (UniqueName: \"kubernetes.io/projected/3c12d527-b20c-4b42-9106-6398d9f1681b-kube-api-access-rkwfp\") pod \"3c12d527-b20c-4b42-9106-6398d9f1681b\" (UID: \"3c12d527-b20c-4b42-9106-6398d9f1681b\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.277963 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c12d527-b20c-4b42-9106-6398d9f1681b-logs" (OuterVolumeSpecName: "logs") pod "3c12d527-b20c-4b42-9106-6398d9f1681b" (UID: "3c12d527-b20c-4b42-9106-6398d9f1681b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.284406 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c12d527-b20c-4b42-9106-6398d9f1681b-kube-api-access-rkwfp" (OuterVolumeSpecName: "kube-api-access-rkwfp") pod "3c12d527-b20c-4b42-9106-6398d9f1681b" (UID: "3c12d527-b20c-4b42-9106-6398d9f1681b"). InnerVolumeSpecName "kube-api-access-rkwfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.337823 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c12d527-b20c-4b42-9106-6398d9f1681b" (UID: "3c12d527-b20c-4b42-9106-6398d9f1681b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.340366 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-config-data" (OuterVolumeSpecName: "config-data") pod "3c12d527-b20c-4b42-9106-6398d9f1681b" (UID: "3c12d527-b20c-4b42-9106-6398d9f1681b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.348250 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.348229923 podStartE2EDuration="2.348229923s" podCreationTimestamp="2025-12-09 04:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:03.275652631 +0000 UTC m=+5884.984958077" watchObservedRunningTime="2025-12-09 04:50:03.348229923 +0000 UTC m=+5885.057535349" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.379705 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.379733 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkwfp\" (UniqueName: \"kubernetes.io/projected/3c12d527-b20c-4b42-9106-6398d9f1681b-kube-api-access-rkwfp\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.379743 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c12d527-b20c-4b42-9106-6398d9f1681b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.379761 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3c12d527-b20c-4b42-9106-6398d9f1681b-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.424721 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.424698751 podStartE2EDuration="2.424698751s" podCreationTimestamp="2025-12-09 04:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:03.353039353 +0000 UTC m=+5885.062344789" watchObservedRunningTime="2025-12-09 04:50:03.424698751 +0000 UTC m=+5885.134004177" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.430257 4766 scope.go:117] "RemoveContainer" containerID="c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.442503 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.475266 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.483241 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-config-data\") pod \"463cd510-5202-403e-8f31-0e0fe1d54659\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.483407 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d5wz\" (UniqueName: \"kubernetes.io/projected/463cd510-5202-403e-8f31-0e0fe1d54659-kube-api-access-8d5wz\") pod \"463cd510-5202-403e-8f31-0e0fe1d54659\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.483440 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-combined-ca-bundle\") pod \"463cd510-5202-403e-8f31-0e0fe1d54659\" (UID: \"463cd510-5202-403e-8f31-0e0fe1d54659\") " Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.491282 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508075 4766 scope.go:117] "RemoveContainer" containerID="b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508174 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.508550 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463cd510-5202-403e-8f31-0e0fe1d54659" containerName="nova-cell0-conductor-conductor" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508563 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="463cd510-5202-403e-8f31-0e0fe1d54659" containerName="nova-cell0-conductor-conductor" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.508583 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-api" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508590 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-api" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.508613 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-metadata" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508619 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-metadata" Dec 09 04:50:03 crc 
kubenswrapper[4766]: E1209 04:50:03.508692 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-log" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508699 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-log" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.508706 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-log" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508712 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-log" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508890 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-metadata" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508900 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-api" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508913 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="463cd510-5202-403e-8f31-0e0fe1d54659" containerName="nova-cell0-conductor-conductor" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508924 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" containerName="nova-metadata-log" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.508931 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" containerName="nova-api-log" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.510404 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3\": container with ID starting with b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3 not found: ID does not exist" containerID="b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.510436 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3"} err="failed to get container status \"b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3\": rpc error: code = NotFound desc = could not find container \"b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3\": container with ID starting with b949865fd924e9fc025010533b8e733f0fd23d360b3b1b735db24f224a7659e3 not found: ID does not exist" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.510477 4766 scope.go:117] "RemoveContainer" containerID="c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.514428 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c\": container with ID starting with c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c not found: ID does not exist" containerID="c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.514499 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c"} err="failed to get container status \"c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c\": rpc error: code = NotFound desc = could not find container \"c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c\": container with ID 
starting with c0e66da6e20b07073c7edb8b3c17d31f1ecaa581e76256820d62ed0ee467cd6c not found: ID does not exist" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.514531 4766 scope.go:117] "RemoveContainer" containerID="5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.517281 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.517441 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.531362 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.532384 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.532550 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.533451 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463cd510-5202-403e-8f31-0e0fe1d54659-kube-api-access-8d5wz" (OuterVolumeSpecName: "kube-api-access-8d5wz") pod "463cd510-5202-403e-8f31-0e0fe1d54659" (UID: "463cd510-5202-403e-8f31-0e0fe1d54659"). InnerVolumeSpecName "kube-api-access-8d5wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.546457 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "463cd510-5202-403e-8f31-0e0fe1d54659" (UID: "463cd510-5202-403e-8f31-0e0fe1d54659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.565333 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.565399 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="60a1d753-d329-4f92-a520-aac278c6acd9" containerName="nova-cell1-conductor-conductor" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.572346 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-config-data" (OuterVolumeSpecName: "config-data") pod "463cd510-5202-403e-8f31-0e0fe1d54659" (UID: "463cd510-5202-403e-8f31-0e0fe1d54659"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.589107 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.589184 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-config-data\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.589199 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stnmn\" (UniqueName: \"kubernetes.io/projected/63d5b763-8225-4cac-bdde-0002c06ed154-kube-api-access-stnmn\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.589231 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d5b763-8225-4cac-bdde-0002c06ed154-logs\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.589300 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.589311 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d5wz\" (UniqueName: 
\"kubernetes.io/projected/463cd510-5202-403e-8f31-0e0fe1d54659-kube-api-access-8d5wz\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.589320 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463cd510-5202-403e-8f31-0e0fe1d54659-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.625375 4766 scope.go:117] "RemoveContainer" containerID="fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.668282 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.688164 4766 scope.go:117] "RemoveContainer" containerID="5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.690495 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270\": container with ID starting with 5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270 not found: ID does not exist" containerID="5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.690524 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270"} err="failed to get container status \"5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270\": rpc error: code = NotFound desc = could not find container \"5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270\": container with ID starting with 5aa4c124fb5bdbc674dd0a1d340fe5008990c4b1fe13b1b0089ecf7f56a78270 not found: ID does not exist" Dec 09 04:50:03 crc kubenswrapper[4766]: 
I1209 04:50:03.690546 4766 scope.go:117] "RemoveContainer" containerID="fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.691184 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.691271 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stnmn\" (UniqueName: \"kubernetes.io/projected/63d5b763-8225-4cac-bdde-0002c06ed154-kube-api-access-stnmn\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.691295 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-config-data\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.691314 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d5b763-8225-4cac-bdde-0002c06ed154-logs\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.691777 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d5b763-8225-4cac-bdde-0002c06ed154-logs\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: E1209 04:50:03.694544 4766 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e\": container with ID starting with fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e not found: ID does not exist" containerID="fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.694594 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e"} err="failed to get container status \"fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e\": rpc error: code = NotFound desc = could not find container \"fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e\": container with ID starting with fd4c7cc7aad73d811f9a4c71e029a7216bfed0a31ec58fda53a8231c266b392e not found: ID does not exist" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.695446 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-config-data\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.695913 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.702795 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.712755 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stnmn\" 
(UniqueName: \"kubernetes.io/projected/63d5b763-8225-4cac-bdde-0002c06ed154-kube-api-access-stnmn\") pod \"nova-metadata-0\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " pod="openstack/nova-metadata-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.725383 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.727050 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.729322 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.746532 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.792945 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.793045 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7nnb\" (UniqueName: \"kubernetes.io/projected/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-kube-api-access-j7nnb\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.793085 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-config-data\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 
04:50:03.793199 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-logs\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.895251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7nnb\" (UniqueName: \"kubernetes.io/projected/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-kube-api-access-j7nnb\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.895357 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-config-data\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.895498 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-logs\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.895598 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.896661 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-logs\") pod \"nova-api-0\" (UID: 
\"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.898990 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-config-data\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.899647 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.911082 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7nnb\" (UniqueName: \"kubernetes.io/projected/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-kube-api-access-j7nnb\") pod \"nova-api-0\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " pod="openstack/nova-api-0" Dec 09 04:50:03 crc kubenswrapper[4766]: I1209 04:50:03.941034 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.046954 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.295676 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"463cd510-5202-403e-8f31-0e0fe1d54659","Type":"ContainerDied","Data":"33bd41231f3d467007d01279abe1e2c1ff89134bc6745fc34243e2084cc5d46f"} Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.295933 4766 scope.go:117] "RemoveContainer" containerID="2803d5877db7f73f5e6c9a48d168bb45b5e63292de040cd6a10870bbc31820a5" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.296065 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.331168 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 04:50:04 crc kubenswrapper[4766]: W1209 04:50:04.347705 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30ee8f60_34a9_4f4f_8deb_324d8bfd2405.slice/crio-ca200d6b5334c4ccaa70734e29c9da3433ababf42cd99e6eafcc49b771a581d6 WatchSource:0}: Error finding container ca200d6b5334c4ccaa70734e29c9da3433ababf42cd99e6eafcc49b771a581d6: Status 404 returned error can't find the container with id ca200d6b5334c4ccaa70734e29c9da3433ababf42cd99e6eafcc49b771a581d6 Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.360847 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.377991 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.396329 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.398058 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.400179 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.404174 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.428579 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.523973 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbgjq\" (UniqueName: \"kubernetes.io/projected/0fd44249-b949-4f58-9973-01c7d3494dcc-kube-api-access-jbgjq\") pod \"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.524137 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.524175 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.625982 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-config-data\") pod 
\"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.626032 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.626134 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbgjq\" (UniqueName: \"kubernetes.io/projected/0fd44249-b949-4f58-9973-01c7d3494dcc-kube-api-access-jbgjq\") pod \"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.631797 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.633753 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.649994 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbgjq\" (UniqueName: \"kubernetes.io/projected/0fd44249-b949-4f58-9973-01c7d3494dcc-kube-api-access-jbgjq\") pod \"nova-cell0-conductor-0\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " 
pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.849672 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.849673 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3794f4f9-a807-42c6-8bb0-f00fab8e7993" path="/var/lib/kubelet/pods/3794f4f9-a807-42c6-8bb0-f00fab8e7993/volumes" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.850522 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c12d527-b20c-4b42-9106-6398d9f1681b" path="/var/lib/kubelet/pods/3c12d527-b20c-4b42-9106-6398d9f1681b/volumes" Dec 09 04:50:04 crc kubenswrapper[4766]: I1209 04:50:04.851145 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463cd510-5202-403e-8f31-0e0fe1d54659" path="/var/lib/kubelet/pods/463cd510-5202-403e-8f31-0e0fe1d54659/volumes" Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.309001 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 04:50:05 crc kubenswrapper[4766]: W1209 04:50:05.318094 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fd44249_b949_4f58_9973_01c7d3494dcc.slice/crio-40db9e28de0ca25e99fc5720b0f379e953b4401069ecd31420b52e2cfa7e5af1 WatchSource:0}: Error finding container 40db9e28de0ca25e99fc5720b0f379e953b4401069ecd31420b52e2cfa7e5af1: Status 404 returned error can't find the container with id 40db9e28de0ca25e99fc5720b0f379e953b4401069ecd31420b52e2cfa7e5af1 Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.319506 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63d5b763-8225-4cac-bdde-0002c06ed154","Type":"ContainerStarted","Data":"4cb91fd0afc5b488372a9a94bb30813d7d6976c95137e2089670348d94b43133"} Dec 09 04:50:05 crc 
kubenswrapper[4766]: I1209 04:50:05.319543 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63d5b763-8225-4cac-bdde-0002c06ed154","Type":"ContainerStarted","Data":"26b07fb1176840683bbf295eaff4294fda9b24682a4f02345af89cf54f1d95b8"} Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.319601 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63d5b763-8225-4cac-bdde-0002c06ed154","Type":"ContainerStarted","Data":"f8dc682b4a3d2cbe26dd7963a2967f00d3d665120fd21a4ca7306d3f594f1a22"} Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.323461 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30ee8f60-34a9-4f4f-8deb-324d8bfd2405","Type":"ContainerStarted","Data":"3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554"} Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.323490 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30ee8f60-34a9-4f4f-8deb-324d8bfd2405","Type":"ContainerStarted","Data":"81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9"} Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.323500 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30ee8f60-34a9-4f4f-8deb-324d8bfd2405","Type":"ContainerStarted","Data":"ca200d6b5334c4ccaa70734e29c9da3433ababf42cd99e6eafcc49b771a581d6"} Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.339744 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.339724253 podStartE2EDuration="2.339724253s" podCreationTimestamp="2025-12-09 04:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:05.336473056 +0000 UTC m=+5887.045778492" watchObservedRunningTime="2025-12-09 
04:50:05.339724253 +0000 UTC m=+5887.049029679" Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.358509 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.358488241 podStartE2EDuration="2.358488241s" podCreationTimestamp="2025-12-09 04:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:05.356829056 +0000 UTC m=+5887.066134492" watchObservedRunningTime="2025-12-09 04:50:05.358488241 +0000 UTC m=+5887.067793667" Dec 09 04:50:05 crc kubenswrapper[4766]: I1209 04:50:05.844305 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:50:05 crc kubenswrapper[4766]: E1209 04:50:05.845063 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:50:06 crc kubenswrapper[4766]: I1209 04:50:06.338454 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0fd44249-b949-4f58-9973-01c7d3494dcc","Type":"ContainerStarted","Data":"97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c"} Dec 09 04:50:06 crc kubenswrapper[4766]: I1209 04:50:06.338522 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0fd44249-b949-4f58-9973-01c7d3494dcc","Type":"ContainerStarted","Data":"40db9e28de0ca25e99fc5720b0f379e953b4401069ecd31420b52e2cfa7e5af1"} Dec 09 04:50:06 crc kubenswrapper[4766]: I1209 04:50:06.340909 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:06 crc kubenswrapper[4766]: I1209 04:50:06.370757 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.370733517 podStartE2EDuration="2.370733517s" podCreationTimestamp="2025-12-09 04:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:06.367684524 +0000 UTC m=+5888.076989950" watchObservedRunningTime="2025-12-09 04:50:06.370733517 +0000 UTC m=+5888.080038983" Dec 09 04:50:06 crc kubenswrapper[4766]: I1209 04:50:06.691976 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:06 crc kubenswrapper[4766]: I1209 04:50:06.702312 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 04:50:07 crc kubenswrapper[4766]: I1209 04:50:07.984318 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.100889 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-config-data\") pod \"60a1d753-d329-4f92-a520-aac278c6acd9\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.101027 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-combined-ca-bundle\") pod \"60a1d753-d329-4f92-a520-aac278c6acd9\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.101071 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t952h\" (UniqueName: \"kubernetes.io/projected/60a1d753-d329-4f92-a520-aac278c6acd9-kube-api-access-t952h\") pod \"60a1d753-d329-4f92-a520-aac278c6acd9\" (UID: \"60a1d753-d329-4f92-a520-aac278c6acd9\") " Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.110789 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a1d753-d329-4f92-a520-aac278c6acd9-kube-api-access-t952h" (OuterVolumeSpecName: "kube-api-access-t952h") pod "60a1d753-d329-4f92-a520-aac278c6acd9" (UID: "60a1d753-d329-4f92-a520-aac278c6acd9"). InnerVolumeSpecName "kube-api-access-t952h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.151536 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60a1d753-d329-4f92-a520-aac278c6acd9" (UID: "60a1d753-d329-4f92-a520-aac278c6acd9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.152743 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-config-data" (OuterVolumeSpecName: "config-data") pod "60a1d753-d329-4f92-a520-aac278c6acd9" (UID: "60a1d753-d329-4f92-a520-aac278c6acd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.204056 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.204107 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t952h\" (UniqueName: \"kubernetes.io/projected/60a1d753-d329-4f92-a520-aac278c6acd9-kube-api-access-t952h\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.204128 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a1d753-d329-4f92-a520-aac278c6acd9-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.362402 4766 generic.go:334] "Generic (PLEG): container finished" podID="60a1d753-d329-4f92-a520-aac278c6acd9" containerID="0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff" exitCode=0 Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.362469 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"60a1d753-d329-4f92-a520-aac278c6acd9","Type":"ContainerDied","Data":"0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff"} Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.362508 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-0" event={"ID":"60a1d753-d329-4f92-a520-aac278c6acd9","Type":"ContainerDied","Data":"d9dcc239e0cbacd77f380825411f684f9026a3a6f6179c3b11dd77ec5cee9602"} Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.362537 4766 scope.go:117] "RemoveContainer" containerID="0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.362711 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.417643 4766 scope.go:117] "RemoveContainer" containerID="0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff" Dec 09 04:50:08 crc kubenswrapper[4766]: E1209 04:50:08.418253 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff\": container with ID starting with 0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff not found: ID does not exist" containerID="0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.418330 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff"} err="failed to get container status \"0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff\": rpc error: code = NotFound desc = could not find container \"0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff\": container with ID starting with 0b4b31954f7ad426febfd93981db8b487362284e83abf6752357247b49c826ff not found: ID does not exist" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.435265 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 
04:50:08.455373 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.465770 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:50:08 crc kubenswrapper[4766]: E1209 04:50:08.466400 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a1d753-d329-4f92-a520-aac278c6acd9" containerName="nova-cell1-conductor-conductor" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.466426 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a1d753-d329-4f92-a520-aac278c6acd9" containerName="nova-cell1-conductor-conductor" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.466903 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a1d753-d329-4f92-a520-aac278c6acd9" containerName="nova-cell1-conductor-conductor" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.467988 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.470663 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.481406 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.515086 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.515537 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.515679 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4dsq\" (UniqueName: \"kubernetes.io/projected/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-kube-api-access-s4dsq\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.617137 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc 
kubenswrapper[4766]: I1209 04:50:08.617372 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.617430 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4dsq\" (UniqueName: \"kubernetes.io/projected/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-kube-api-access-s4dsq\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.623644 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.623853 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.639412 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4dsq\" (UniqueName: \"kubernetes.io/projected/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-kube-api-access-s4dsq\") pod \"nova-cell1-conductor-0\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.788913 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.859787 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60a1d753-d329-4f92-a520-aac278c6acd9" path="/var/lib/kubelet/pods/60a1d753-d329-4f92-a520-aac278c6acd9/volumes" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.942493 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 04:50:08 crc kubenswrapper[4766]: I1209 04:50:08.943400 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 04:50:09 crc kubenswrapper[4766]: I1209 04:50:09.292868 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 04:50:09 crc kubenswrapper[4766]: W1209 04:50:09.295167 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67b8b4b8_be77_4988_b2a6_5f2c5e21d874.slice/crio-1b16c70e6f9d83c9f51d7b9b327567d9c70c202cbb1d4799b849279a2eb5c468 WatchSource:0}: Error finding container 1b16c70e6f9d83c9f51d7b9b327567d9c70c202cbb1d4799b849279a2eb5c468: Status 404 returned error can't find the container with id 1b16c70e6f9d83c9f51d7b9b327567d9c70c202cbb1d4799b849279a2eb5c468 Dec 09 04:50:09 crc kubenswrapper[4766]: I1209 04:50:09.373764 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"67b8b4b8-be77-4988-b2a6-5f2c5e21d874","Type":"ContainerStarted","Data":"1b16c70e6f9d83c9f51d7b9b327567d9c70c202cbb1d4799b849279a2eb5c468"} Dec 09 04:50:10 crc kubenswrapper[4766]: I1209 04:50:10.387791 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"67b8b4b8-be77-4988-b2a6-5f2c5e21d874","Type":"ContainerStarted","Data":"46bac6e0959b2e7c1cb812308bc7f3c437527d4b8e8da8360281f1fa3c9e6784"} Dec 09 04:50:10 crc kubenswrapper[4766]: 
I1209 04:50:10.388170 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:10 crc kubenswrapper[4766]: I1209 04:50:10.422389 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.422363453 podStartE2EDuration="2.422363453s" podCreationTimestamp="2025-12-09 04:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:10.404750296 +0000 UTC m=+5892.114055742" watchObservedRunningTime="2025-12-09 04:50:10.422363453 +0000 UTC m=+5892.131668919" Dec 09 04:50:11 crc kubenswrapper[4766]: I1209 04:50:11.692523 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:11 crc kubenswrapper[4766]: I1209 04:50:11.701766 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 04:50:11 crc kubenswrapper[4766]: I1209 04:50:11.702090 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:11 crc kubenswrapper[4766]: I1209 04:50:11.749187 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 04:50:12 crc kubenswrapper[4766]: I1209 04:50:12.420874 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 09 04:50:12 crc kubenswrapper[4766]: I1209 04:50:12.434131 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 04:50:13 crc kubenswrapper[4766]: I1209 04:50:13.942858 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 04:50:13 crc kubenswrapper[4766]: I1209 04:50:13.943179 4766 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 04:50:14 crc kubenswrapper[4766]: I1209 04:50:14.048027 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 04:50:14 crc kubenswrapper[4766]: I1209 04:50:14.048076 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 04:50:14 crc kubenswrapper[4766]: I1209 04:50:14.886580 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 04:50:15 crc kubenswrapper[4766]: I1209 04:50:15.024439 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:50:15 crc kubenswrapper[4766]: I1209 04:50:15.024449 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:50:15 crc kubenswrapper[4766]: I1209 04:50:15.131418 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:50:15 crc kubenswrapper[4766]: I1209 04:50:15.131662 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:50:18 crc kubenswrapper[4766]: I1209 04:50:18.858373 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.267088 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.268973 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.271227 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.277440 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.346829 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.346906 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.346979 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6njtf\" (UniqueName: \"kubernetes.io/projected/a5a2be4e-a17f-4009-aaca-7bb5953d836a-kube-api-access-6njtf\") pod \"cinder-scheduler-0\" (UID: 
\"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.347030 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.347086 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.347112 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5a2be4e-a17f-4009-aaca-7bb5953d836a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.448123 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.448173 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc 
kubenswrapper[4766]: I1209 04:50:19.448190 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5a2be4e-a17f-4009-aaca-7bb5953d836a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.448239 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.448283 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.448332 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6njtf\" (UniqueName: \"kubernetes.io/projected/a5a2be4e-a17f-4009-aaca-7bb5953d836a-kube-api-access-6njtf\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.449206 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5a2be4e-a17f-4009-aaca-7bb5953d836a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.454246 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.457593 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.457628 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.465117 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-scripts\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.465509 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6njtf\" (UniqueName: \"kubernetes.io/projected/a5a2be4e-a17f-4009-aaca-7bb5953d836a-kube-api-access-6njtf\") pod \"cinder-scheduler-0\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:19 crc kubenswrapper[4766]: I1209 04:50:19.586505 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 04:50:20 crc kubenswrapper[4766]: I1209 04:50:20.062661 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:20 crc kubenswrapper[4766]: I1209 04:50:20.489058 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a5a2be4e-a17f-4009-aaca-7bb5953d836a","Type":"ContainerStarted","Data":"7b45df46e92b61beb79f8f6d0dec2cc57a0cf111df1444c2b64d90737e3522ee"} Dec 09 04:50:20 crc kubenswrapper[4766]: I1209 04:50:20.840121 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:50:20 crc kubenswrapper[4766]: E1209 04:50:20.840399 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:50:20 crc kubenswrapper[4766]: I1209 04:50:20.850729 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:50:20 crc kubenswrapper[4766]: I1209 04:50:20.850941 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api-log" containerID="cri-o://93caa0dbdd1b93e2049b64a62e671d41b6260dc67472fbba16bbc7ed01b0a01c" gracePeriod=30 Dec 09 04:50:20 crc kubenswrapper[4766]: I1209 04:50:20.851236 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api" 
containerID="cri-o://0635a3e8513c85c5c7c986500448a30ac62cbbbb8f1c8a089c82b85256423e18" gracePeriod=30 Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.397442 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.426391 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.437334 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.454378 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.507113 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a5a2be4e-a17f-4009-aaca-7bb5953d836a","Type":"ContainerStarted","Data":"3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470"} Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.507164 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a5a2be4e-a17f-4009-aaca-7bb5953d836a","Type":"ContainerStarted","Data":"1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74"} Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.515938 4766 generic.go:334] "Generic (PLEG): container finished" podID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerID="93caa0dbdd1b93e2049b64a62e671d41b6260dc67472fbba16bbc7ed01b0a01c" exitCode=143 Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.515991 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a15b1fac-7acc-411c-8f77-e80e15ca27ec","Type":"ContainerDied","Data":"93caa0dbdd1b93e2049b64a62e671d41b6260dc67472fbba16bbc7ed01b0a01c"} Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 
04:50:21.524323 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.524306464 podStartE2EDuration="2.524306464s" podCreationTimestamp="2025-12-09 04:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:21.522640629 +0000 UTC m=+5903.231946065" watchObservedRunningTime="2025-12-09 04:50:21.524306464 +0000 UTC m=+5903.233611880" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.526594 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.527727 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.527882 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsc5b\" (UniqueName: \"kubernetes.io/projected/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-kube-api-access-lsc5b\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.528135 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-dev\") pod 
\"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.528290 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.528442 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.528582 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-run\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.528675 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.528769 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: 
\"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.528900 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.530045 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.530138 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.530255 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.530348 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: 
\"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.530455 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.530582 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632272 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632688 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632748 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 
04:50:21.632790 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-run\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632818 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632852 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632906 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632937 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632969 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.633006 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.633040 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.633081 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.633127 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.633193 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: 
\"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.633257 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.633302 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsc5b\" (UniqueName: \"kubernetes.io/projected/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-kube-api-access-lsc5b\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.632429 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-dev\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.633877 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.634036 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.634088 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-run\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.634472 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.634731 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.635078 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-sys\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.635265 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.635364 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-locks-brick\") pod 
\"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.635429 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.643378 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.643468 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.643813 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.647075 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: 
I1209 04:50:21.653639 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.660750 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsc5b\" (UniqueName: \"kubernetes.io/projected/e5cb8425-9990-4aa8-8d37-c6276ef64bb4-kube-api-access-lsc5b\") pod \"cinder-volume-volume1-0\" (UID: \"e5cb8425-9990-4aa8-8d37-c6276ef64bb4\") " pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:21 crc kubenswrapper[4766]: I1209 04:50:21.783145 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.381111 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.397885 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.529126 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e5cb8425-9990-4aa8-8d37-c6276ef64bb4","Type":"ContainerStarted","Data":"415e3d9b857aadaf406cff1ebe0c0f5329c44b8d662b5ade77fd7b2f164fb9b3"} Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.702915 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.704512 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.714764 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.724923 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757027 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/89875ded-a9cb-4747-805a-15d7720291f6-ceph\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757099 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-scripts\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757130 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757167 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757202 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757240 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-run\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757270 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-dev\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757300 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757333 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-config-data\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757473 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757531 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbfjs\" (UniqueName: \"kubernetes.io/projected/89875ded-a9cb-4747-805a-15d7720291f6-kube-api-access-zbfjs\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757561 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757602 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-sys\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757631 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757658 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-lib-modules\") pod \"cinder-backup-0\" 
(UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.757696 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859303 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-sys\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859363 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859386 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-lib-modules\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859383 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-sys\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859447 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-lib-modules\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859472 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859895 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859928 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/89875ded-a9cb-4747-805a-15d7720291f6-ceph\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.859969 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-scripts\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860061 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 
04:50:22.860069 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860131 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860171 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860187 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-run\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860248 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-dev\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-config-data-custom\") pod 
\"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860323 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-config-data\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860377 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860429 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbfjs\" (UniqueName: \"kubernetes.io/projected/89875ded-a9cb-4747-805a-15d7720291f6-kube-api-access-zbfjs\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860450 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.860986 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.861337 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.861381 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-run\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.861416 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.861772 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.862113 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/89875ded-a9cb-4747-805a-15d7720291f6-dev\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.870325 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-scripts\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " 
pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.870694 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.871628 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/89875ded-a9cb-4747-805a-15d7720291f6-ceph\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.872202 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.890506 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbfjs\" (UniqueName: \"kubernetes.io/projected/89875ded-a9cb-4747-805a-15d7720291f6-kube-api-access-zbfjs\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:22 crc kubenswrapper[4766]: I1209 04:50:22.890989 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89875ded-a9cb-4747-805a-15d7720291f6-config-data\") pod \"cinder-backup-0\" (UID: \"89875ded-a9cb-4747-805a-15d7720291f6\") " pod="openstack/cinder-backup-0" Dec 09 04:50:23 crc kubenswrapper[4766]: I1209 04:50:23.033557 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 09 04:50:23 crc kubenswrapper[4766]: I1209 04:50:23.541303 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e5cb8425-9990-4aa8-8d37-c6276ef64bb4","Type":"ContainerStarted","Data":"1217a38fe5b8e5414809cb6824563db27eafe2e864f2c742d124f57a7bb3fcf2"} Dec 09 04:50:23 crc kubenswrapper[4766]: I1209 04:50:23.541614 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"e5cb8425-9990-4aa8-8d37-c6276ef64bb4","Type":"ContainerStarted","Data":"0be6e4404cc890c828f4646df7980889b885c2b51bc1c7936b20a366c4c0df44"} Dec 09 04:50:23 crc kubenswrapper[4766]: I1209 04:50:23.560751 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=1.805577483 podStartE2EDuration="2.560731849s" podCreationTimestamp="2025-12-09 04:50:21 +0000 UTC" firstStartedPulling="2025-12-09 04:50:22.380828799 +0000 UTC m=+5904.090134235" lastFinishedPulling="2025-12-09 04:50:23.135983175 +0000 UTC m=+5904.845288601" observedRunningTime="2025-12-09 04:50:23.559389382 +0000 UTC m=+5905.268694828" watchObservedRunningTime="2025-12-09 04:50:23.560731849 +0000 UTC m=+5905.270037275" Dec 09 04:50:23 crc kubenswrapper[4766]: I1209 04:50:23.642708 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 09 04:50:23 crc kubenswrapper[4766]: I1209 04:50:23.944309 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 04:50:23 crc kubenswrapper[4766]: I1209 04:50:23.945758 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 04:50:23 crc kubenswrapper[4766]: I1209 04:50:23.948650 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 04:50:24 crc kubenswrapper[4766]: 
I1209 04:50:24.052038 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.052479 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.057150 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.076679 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.317400 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.78:8776/healthcheck\": read tcp 10.217.0.2:51860->10.217.1.78:8776: read: connection reset by peer" Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.557256 4766 generic.go:334] "Generic (PLEG): container finished" podID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerID="0635a3e8513c85c5c7c986500448a30ac62cbbbb8f1c8a089c82b85256423e18" exitCode=0 Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.557336 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a15b1fac-7acc-411c-8f77-e80e15ca27ec","Type":"ContainerDied","Data":"0635a3e8513c85c5c7c986500448a30ac62cbbbb8f1c8a089c82b85256423e18"} Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.564713 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"89875ded-a9cb-4747-805a-15d7720291f6","Type":"ContainerStarted","Data":"18dcbc4b01a993a3580e556d2dbbb316e3c62a8ff0b0ca9bc18b2b1cba6ab6e0"} Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.564763 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" 
Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.567790 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.569273 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.587044 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 04:50:24 crc kubenswrapper[4766]: I1209 04:50:24.849517 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.017672 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a15b1fac-7acc-411c-8f77-e80e15ca27ec-etc-machine-id\") pod \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.017732 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a15b1fac-7acc-411c-8f77-e80e15ca27ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a15b1fac-7acc-411c-8f77-e80e15ca27ec" (UID: "a15b1fac-7acc-411c-8f77-e80e15ca27ec"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.017794 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b1fac-7acc-411c-8f77-e80e15ca27ec-logs\") pod \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.017863 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crxjp\" (UniqueName: \"kubernetes.io/projected/a15b1fac-7acc-411c-8f77-e80e15ca27ec-kube-api-access-crxjp\") pod \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.017991 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data\") pod \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.018055 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-combined-ca-bundle\") pod \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.018123 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data-custom\") pod \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.018150 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-scripts\") pod \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\" (UID: \"a15b1fac-7acc-411c-8f77-e80e15ca27ec\") " Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.018723 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a15b1fac-7acc-411c-8f77-e80e15ca27ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.019445 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15b1fac-7acc-411c-8f77-e80e15ca27ec-logs" (OuterVolumeSpecName: "logs") pod "a15b1fac-7acc-411c-8f77-e80e15ca27ec" (UID: "a15b1fac-7acc-411c-8f77-e80e15ca27ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.023835 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-scripts" (OuterVolumeSpecName: "scripts") pod "a15b1fac-7acc-411c-8f77-e80e15ca27ec" (UID: "a15b1fac-7acc-411c-8f77-e80e15ca27ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.026649 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15b1fac-7acc-411c-8f77-e80e15ca27ec-kube-api-access-crxjp" (OuterVolumeSpecName: "kube-api-access-crxjp") pod "a15b1fac-7acc-411c-8f77-e80e15ca27ec" (UID: "a15b1fac-7acc-411c-8f77-e80e15ca27ec"). InnerVolumeSpecName "kube-api-access-crxjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.043349 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a15b1fac-7acc-411c-8f77-e80e15ca27ec" (UID: "a15b1fac-7acc-411c-8f77-e80e15ca27ec"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.054962 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a15b1fac-7acc-411c-8f77-e80e15ca27ec" (UID: "a15b1fac-7acc-411c-8f77-e80e15ca27ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.115477 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data" (OuterVolumeSpecName: "config-data") pod "a15b1fac-7acc-411c-8f77-e80e15ca27ec" (UID: "a15b1fac-7acc-411c-8f77-e80e15ca27ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.121661 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crxjp\" (UniqueName: \"kubernetes.io/projected/a15b1fac-7acc-411c-8f77-e80e15ca27ec-kube-api-access-crxjp\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.121706 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.121719 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.121735 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.121747 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b1fac-7acc-411c-8f77-e80e15ca27ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.121758 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b1fac-7acc-411c-8f77-e80e15ca27ec-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.573824 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a15b1fac-7acc-411c-8f77-e80e15ca27ec","Type":"ContainerDied","Data":"6fcd708c5f6050709d7e78251ae8fd329d9f3e08332926e7916d779fc3fa294f"} Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.574061 
4766 scope.go:117] "RemoveContainer" containerID="0635a3e8513c85c5c7c986500448a30ac62cbbbb8f1c8a089c82b85256423e18" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.574164 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.584120 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"89875ded-a9cb-4747-805a-15d7720291f6","Type":"ContainerStarted","Data":"799c4051feb2a497fdf2a2e6b0e8ae8790a798086bb503ad8dcd5d1ecdaf7342"} Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.584154 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"89875ded-a9cb-4747-805a-15d7720291f6","Type":"ContainerStarted","Data":"7cb5f98865f35982415ccc4b4789e6029734b2bc84a30d9af35353c079aa67cd"} Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.611075 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.899962994 podStartE2EDuration="3.611058508s" podCreationTimestamp="2025-12-09 04:50:22 +0000 UTC" firstStartedPulling="2025-12-09 04:50:23.642244042 +0000 UTC m=+5905.351549468" lastFinishedPulling="2025-12-09 04:50:24.353339556 +0000 UTC m=+5906.062644982" observedRunningTime="2025-12-09 04:50:25.606145846 +0000 UTC m=+5907.315451292" watchObservedRunningTime="2025-12-09 04:50:25.611058508 +0000 UTC m=+5907.320363934" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.615806 4766 scope.go:117] "RemoveContainer" containerID="93caa0dbdd1b93e2049b64a62e671d41b6260dc67472fbba16bbc7ed01b0a01c" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.651383 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.666323 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:50:25 
crc kubenswrapper[4766]: I1209 04:50:25.676694 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:50:25 crc kubenswrapper[4766]: E1209 04:50:25.677157 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api-log" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.677174 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api-log" Dec 09 04:50:25 crc kubenswrapper[4766]: E1209 04:50:25.677184 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.677191 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.677382 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.677399 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" containerName="cinder-api-log" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.678360 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.680607 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.687007 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.833199 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-config-data\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.833271 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49wnm\" (UniqueName: \"kubernetes.io/projected/5057c712-9d00-4dcf-80ba-f67a171b0828-kube-api-access-49wnm\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.833302 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5057c712-9d00-4dcf-80ba-f67a171b0828-logs\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.833348 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-config-data-custom\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.833620 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.833660 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5057c712-9d00-4dcf-80ba-f67a171b0828-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.833734 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-scripts\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.936415 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5057c712-9d00-4dcf-80ba-f67a171b0828-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.936517 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-scripts\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.936544 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5057c712-9d00-4dcf-80ba-f67a171b0828-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.936596 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-config-data\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.936643 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49wnm\" (UniqueName: \"kubernetes.io/projected/5057c712-9d00-4dcf-80ba-f67a171b0828-kube-api-access-49wnm\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.936668 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5057c712-9d00-4dcf-80ba-f67a171b0828-logs\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.936714 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-config-data-custom\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.936747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.937275 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/5057c712-9d00-4dcf-80ba-f67a171b0828-logs\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.946161 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-config-data-custom\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.946873 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-config-data\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.949752 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-scripts\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.954257 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49wnm\" (UniqueName: \"kubernetes.io/projected/5057c712-9d00-4dcf-80ba-f67a171b0828-kube-api-access-49wnm\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:25 crc kubenswrapper[4766]: I1209 04:50:25.956947 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5057c712-9d00-4dcf-80ba-f67a171b0828-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5057c712-9d00-4dcf-80ba-f67a171b0828\") " pod="openstack/cinder-api-0" Dec 09 04:50:26 crc kubenswrapper[4766]: I1209 04:50:26.000772 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 09 04:50:26 crc kubenswrapper[4766]: I1209 04:50:26.488528 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 09 04:50:26 crc kubenswrapper[4766]: I1209 04:50:26.597622 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5057c712-9d00-4dcf-80ba-f67a171b0828","Type":"ContainerStarted","Data":"6503d656308c0a50feebc320c8e0f49a6df8bd9dec264e05f1474d554e68fbc3"} Dec 09 04:50:26 crc kubenswrapper[4766]: I1209 04:50:26.784054 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:26 crc kubenswrapper[4766]: I1209 04:50:26.854285 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15b1fac-7acc-411c-8f77-e80e15ca27ec" path="/var/lib/kubelet/pods/a15b1fac-7acc-411c-8f77-e80e15ca27ec/volumes" Dec 09 04:50:27 crc kubenswrapper[4766]: I1209 04:50:27.612085 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5057c712-9d00-4dcf-80ba-f67a171b0828","Type":"ContainerStarted","Data":"f975d9e62023bcf9be1eee1c974423a5a4e4bf4425e51c9c7f9ded715a058612"} Dec 09 04:50:28 crc kubenswrapper[4766]: I1209 04:50:28.034690 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 09 04:50:28 crc kubenswrapper[4766]: I1209 04:50:28.630573 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5057c712-9d00-4dcf-80ba-f67a171b0828","Type":"ContainerStarted","Data":"6e60672496a10551ae9658cbc74dadc32e327020b9217d50ebd7553b900c75eb"} Dec 09 04:50:28 crc kubenswrapper[4766]: I1209 04:50:28.630833 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 09 04:50:28 crc kubenswrapper[4766]: I1209 04:50:28.650361 4766 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.650343926 podStartE2EDuration="3.650343926s" podCreationTimestamp="2025-12-09 04:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:28.648349562 +0000 UTC m=+5910.357654988" watchObservedRunningTime="2025-12-09 04:50:28.650343926 +0000 UTC m=+5910.359649352" Dec 09 04:50:29 crc kubenswrapper[4766]: I1209 04:50:29.855329 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 04:50:29 crc kubenswrapper[4766]: I1209 04:50:29.915463 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:30 crc kubenswrapper[4766]: I1209 04:50:30.658593 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerName="cinder-scheduler" containerID="cri-o://1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74" gracePeriod=30 Dec 09 04:50:30 crc kubenswrapper[4766]: I1209 04:50:30.658695 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerName="probe" containerID="cri-o://3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470" gracePeriod=30 Dec 09 04:50:31 crc kubenswrapper[4766]: I1209 04:50:31.668266 4766 generic.go:334] "Generic (PLEG): container finished" podID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerID="3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470" exitCode=0 Dec 09 04:50:31 crc kubenswrapper[4766]: I1209 04:50:31.668312 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a5a2be4e-a17f-4009-aaca-7bb5953d836a","Type":"ContainerDied","Data":"3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470"} Dec 09 04:50:31 crc kubenswrapper[4766]: I1209 04:50:31.839383 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:50:31 crc kubenswrapper[4766]: E1209 04:50:31.840173 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:50:31 crc kubenswrapper[4766]: I1209 04:50:31.963615 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.380617 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.465076 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6njtf\" (UniqueName: \"kubernetes.io/projected/a5a2be4e-a17f-4009-aaca-7bb5953d836a-kube-api-access-6njtf\") pod \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.465400 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data-custom\") pod \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.465439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data\") pod \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.465458 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5a2be4e-a17f-4009-aaca-7bb5953d836a-etc-machine-id\") pod \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.465523 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-scripts\") pod \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.465594 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-combined-ca-bundle\") pod \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\" (UID: \"a5a2be4e-a17f-4009-aaca-7bb5953d836a\") " Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.467230 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5a2be4e-a17f-4009-aaca-7bb5953d836a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a5a2be4e-a17f-4009-aaca-7bb5953d836a" (UID: "a5a2be4e-a17f-4009-aaca-7bb5953d836a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.471882 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a5a2be4e-a17f-4009-aaca-7bb5953d836a" (UID: "a5a2be4e-a17f-4009-aaca-7bb5953d836a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.486640 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-scripts" (OuterVolumeSpecName: "scripts") pod "a5a2be4e-a17f-4009-aaca-7bb5953d836a" (UID: "a5a2be4e-a17f-4009-aaca-7bb5953d836a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.487819 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a2be4e-a17f-4009-aaca-7bb5953d836a-kube-api-access-6njtf" (OuterVolumeSpecName: "kube-api-access-6njtf") pod "a5a2be4e-a17f-4009-aaca-7bb5953d836a" (UID: "a5a2be4e-a17f-4009-aaca-7bb5953d836a"). InnerVolumeSpecName "kube-api-access-6njtf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.521716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5a2be4e-a17f-4009-aaca-7bb5953d836a" (UID: "a5a2be4e-a17f-4009-aaca-7bb5953d836a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.563012 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data" (OuterVolumeSpecName: "config-data") pod "a5a2be4e-a17f-4009-aaca-7bb5953d836a" (UID: "a5a2be4e-a17f-4009-aaca-7bb5953d836a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.568110 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6njtf\" (UniqueName: \"kubernetes.io/projected/a5a2be4e-a17f-4009-aaca-7bb5953d836a-kube-api-access-6njtf\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.568139 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.568148 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.568157 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5a2be4e-a17f-4009-aaca-7bb5953d836a-etc-machine-id\") on node \"crc\" 
DevicePath \"\"" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.568166 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.568174 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5a2be4e-a17f-4009-aaca-7bb5953d836a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.678737 4766 generic.go:334] "Generic (PLEG): container finished" podID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerID="1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74" exitCode=0 Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.678786 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a5a2be4e-a17f-4009-aaca-7bb5953d836a","Type":"ContainerDied","Data":"1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74"} Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.678811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a5a2be4e-a17f-4009-aaca-7bb5953d836a","Type":"ContainerDied","Data":"7b45df46e92b61beb79f8f6d0dec2cc57a0cf111df1444c2b64d90737e3522ee"} Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.678829 4766 scope.go:117] "RemoveContainer" containerID="3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.678995 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.710225 4766 scope.go:117] "RemoveContainer" containerID="1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.719575 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.728442 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.744027 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:32 crc kubenswrapper[4766]: E1209 04:50:32.744438 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerName="cinder-scheduler" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.744458 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerName="cinder-scheduler" Dec 09 04:50:32 crc kubenswrapper[4766]: E1209 04:50:32.744491 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerName="probe" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.745507 4766 scope.go:117] "RemoveContainer" containerID="3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470" Dec 09 04:50:32 crc kubenswrapper[4766]: E1209 04:50:32.745963 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470\": container with ID starting with 3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470 not found: ID does not exist" containerID="3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.746014 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470"} err="failed to get container status \"3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470\": rpc error: code = NotFound desc = could not find container \"3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470\": container with ID starting with 3af5c0e8ac330e1f6ee9cdabb73193795cc8627ffff221ea16ff58c1a585f470 not found: ID does not exist" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.746043 4766 scope.go:117] "RemoveContainer" containerID="1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.744498 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerName="probe" Dec 09 04:50:32 crc kubenswrapper[4766]: E1209 04:50:32.746762 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74\": container with ID starting with 1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74 not found: ID does not exist" containerID="1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.746797 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74"} err="failed to get container status \"1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74\": rpc error: code = NotFound desc = could not find container \"1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74\": container with ID starting with 1cc861a135cfa1b7b6de51a7aa85e63bc93867da47390ae9ce06b4d06365cb74 not found: ID does not exist" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 
04:50:32.746874 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerName="cinder-scheduler" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.746904 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" containerName="probe" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.747911 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.750183 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.760285 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.857323 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a2be4e-a17f-4009-aaca-7bb5953d836a" path="/var/lib/kubelet/pods/a5a2be4e-a17f-4009-aaca-7bb5953d836a/volumes" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.873033 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-scripts\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.873118 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.873262 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-config-data\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.873288 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2b2de8b-3edd-4831-8c78-df96f83cceef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.873313 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.873344 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnrqw\" (UniqueName: \"kubernetes.io/projected/d2b2de8b-3edd-4831-8c78-df96f83cceef-kube-api-access-qnrqw\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.974865 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-scripts\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.974952 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.975037 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-config-data\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.975065 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2b2de8b-3edd-4831-8c78-df96f83cceef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.975095 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.975130 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnrqw\" (UniqueName: \"kubernetes.io/projected/d2b2de8b-3edd-4831-8c78-df96f83cceef-kube-api-access-qnrqw\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.975345 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2b2de8b-3edd-4831-8c78-df96f83cceef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.978971 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-scripts\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.978980 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.979098 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.980248 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b2de8b-3edd-4831-8c78-df96f83cceef-config-data\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:32 crc kubenswrapper[4766]: I1209 04:50:32.998491 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnrqw\" (UniqueName: \"kubernetes.io/projected/d2b2de8b-3edd-4831-8c78-df96f83cceef-kube-api-access-qnrqw\") pod \"cinder-scheduler-0\" (UID: \"d2b2de8b-3edd-4831-8c78-df96f83cceef\") " pod="openstack/cinder-scheduler-0" Dec 09 04:50:33 crc kubenswrapper[4766]: I1209 04:50:33.071623 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 09 04:50:33 crc kubenswrapper[4766]: I1209 04:50:33.269026 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 09 04:50:33 crc kubenswrapper[4766]: I1209 04:50:33.552673 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 09 04:50:33 crc kubenswrapper[4766]: I1209 04:50:33.702271 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d2b2de8b-3edd-4831-8c78-df96f83cceef","Type":"ContainerStarted","Data":"8df13e46d36608646aec62bd61d767ff989a267cfd1ffd7f512dff5e45328e3a"} Dec 09 04:50:34 crc kubenswrapper[4766]: I1209 04:50:34.712331 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d2b2de8b-3edd-4831-8c78-df96f83cceef","Type":"ContainerStarted","Data":"b802c056bc4e1d2ecdfe19107b9eb8ddf8642a29e2ae4657ec737880206a7adb"} Dec 09 04:50:35 crc kubenswrapper[4766]: I1209 04:50:35.723886 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d2b2de8b-3edd-4831-8c78-df96f83cceef","Type":"ContainerStarted","Data":"d90cc7eaa1221ade0adb555ecc5dda6a4b53ff8c6a6e1da71d7cddd35f0a8da2"} Dec 09 04:50:35 crc kubenswrapper[4766]: I1209 04:50:35.805977 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.8059544880000002 podStartE2EDuration="3.805954488s" podCreationTimestamp="2025-12-09 04:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:50:35.7912448 +0000 UTC m=+5917.500550246" watchObservedRunningTime="2025-12-09 04:50:35.805954488 +0000 UTC m=+5917.515259924" Dec 09 04:50:37 crc kubenswrapper[4766]: I1209 04:50:37.763570 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/cinder-api-0" Dec 09 04:50:38 crc kubenswrapper[4766]: I1209 04:50:38.072805 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 09 04:50:42 crc kubenswrapper[4766]: I1209 04:50:42.839979 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:50:42 crc kubenswrapper[4766]: E1209 04:50:42.841285 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:50:43 crc kubenswrapper[4766]: I1209 04:50:43.309509 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 09 04:50:55 crc kubenswrapper[4766]: I1209 04:50:55.839441 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:50:55 crc kubenswrapper[4766]: E1209 04:50:55.840590 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:51:08 crc kubenswrapper[4766]: I1209 04:51:08.839531 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:51:09 crc kubenswrapper[4766]: I1209 04:51:09.152417 4766 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"0fb3a256512499ae16934c9ba9ee1a59382e5cd9e65f810fb30d76fbdefbf701"} Dec 09 04:52:14 crc kubenswrapper[4766]: I1209 04:52:14.881639 4766 scope.go:117] "RemoveContainer" containerID="09134fe6ebc545ebe7f58915dae51d4711f8f66cfd567f96bfb070631a44293c" Dec 09 04:52:14 crc kubenswrapper[4766]: I1209 04:52:14.927593 4766 scope.go:117] "RemoveContainer" containerID="471a4906e9207607dfb3dbbd1e235a3ab91b87c2c6093aa97d8642f3dc8a6631" Dec 09 04:52:14 crc kubenswrapper[4766]: I1209 04:52:14.970556 4766 scope.go:117] "RemoveContainer" containerID="f577513496fc8292f579964a07b701d2fc0eff18885297bcfe522d0f2edbfddb" Dec 09 04:52:25 crc kubenswrapper[4766]: I1209 04:52:25.082367 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-s5krb"] Dec 09 04:52:25 crc kubenswrapper[4766]: I1209 04:52:25.091662 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-564a-account-create-update-92zz4"] Dec 09 04:52:25 crc kubenswrapper[4766]: I1209 04:52:25.107042 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-564a-account-create-update-92zz4"] Dec 09 04:52:25 crc kubenswrapper[4766]: I1209 04:52:25.117471 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-s5krb"] Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.860732 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1ebbf5-ae48-47ad-a185-ae686bcc2839" path="/var/lib/kubelet/pods/1b1ebbf5-ae48-47ad-a185-ae686bcc2839/volumes" Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.861922 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6190f627-a269-4c61-8cd6-da76dae23866" path="/var/lib/kubelet/pods/6190f627-a269-4c61-8cd6-da76dae23866/volumes" Dec 09 04:52:26 crc kubenswrapper[4766]: 
I1209 04:52:26.869998 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-94xfz"]
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.872960 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zx2fg"]
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.879108 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.885053 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hvxkk"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.885394 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.904534 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-94xfz"]
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.904664 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.925247 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zx2fg"]
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.973542 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-run-ovn\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.973898 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0572d99b-c46f-4734-bf6f-72c01b8a8e84-scripts\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.973917 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-run\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.973956 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-etc-ovs\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.973974 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7415e725-513c-4557-8d04-62a9c452ec6c-scripts\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.973995 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-lib\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.974012 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-run\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.974055 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4vb8\" (UniqueName: \"kubernetes.io/projected/0572d99b-c46f-4734-bf6f-72c01b8a8e84-kube-api-access-r4vb8\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.974168 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwh7g\" (UniqueName: \"kubernetes.io/projected/7415e725-513c-4557-8d04-62a9c452ec6c-kube-api-access-rwh7g\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.974274 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-log-ovn\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:26 crc kubenswrapper[4766]: I1209 04:52:26.974471 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-log\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076367 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-run-ovn\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076437 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0572d99b-c46f-4734-bf6f-72c01b8a8e84-scripts\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076460 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-run\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076499 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-etc-ovs\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076517 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7415e725-513c-4557-8d04-62a9c452ec6c-scripts\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076536 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-lib\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-run\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076576 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4vb8\" (UniqueName: \"kubernetes.io/projected/0572d99b-c46f-4734-bf6f-72c01b8a8e84-kube-api-access-r4vb8\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076597 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwh7g\" (UniqueName: \"kubernetes.io/projected/7415e725-513c-4557-8d04-62a9c452ec6c-kube-api-access-rwh7g\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076614 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-log-ovn\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076662 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-log\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076762 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-run-ovn\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076789 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-log\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076823 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-run\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-etc-ovs\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.076767 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-run\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.077078 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0572d99b-c46f-4734-bf6f-72c01b8a8e84-var-lib\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.077140 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7415e725-513c-4557-8d04-62a9c452ec6c-var-log-ovn\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.079052 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7415e725-513c-4557-8d04-62a9c452ec6c-scripts\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.079945 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0572d99b-c46f-4734-bf6f-72c01b8a8e84-scripts\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.102432 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4vb8\" (UniqueName: \"kubernetes.io/projected/0572d99b-c46f-4734-bf6f-72c01b8a8e84-kube-api-access-r4vb8\") pod \"ovn-controller-ovs-zx2fg\" (UID: \"0572d99b-c46f-4734-bf6f-72c01b8a8e84\") " pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.106901 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwh7g\" (UniqueName: \"kubernetes.io/projected/7415e725-513c-4557-8d04-62a9c452ec6c-kube-api-access-rwh7g\") pod \"ovn-controller-94xfz\" (UID: \"7415e725-513c-4557-8d04-62a9c452ec6c\") " pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.214582 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.246162 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:27 crc kubenswrapper[4766]: I1209 04:52:27.696340 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-94xfz"]
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.090046 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zx2fg"]
Dec 09 04:52:28 crc kubenswrapper[4766]: W1209 04:52:28.100316 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0572d99b_c46f_4734_bf6f_72c01b8a8e84.slice/crio-fdf3b7c222a553afacbd80efff10701cb058514f8e00fcb9d64b1ebbdfc5480d WatchSource:0}: Error finding container fdf3b7c222a553afacbd80efff10701cb058514f8e00fcb9d64b1ebbdfc5480d: Status 404 returned error can't find the container with id fdf3b7c222a553afacbd80efff10701cb058514f8e00fcb9d64b1ebbdfc5480d
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.140891 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zx2fg" event={"ID":"0572d99b-c46f-4734-bf6f-72c01b8a8e84","Type":"ContainerStarted","Data":"fdf3b7c222a553afacbd80efff10701cb058514f8e00fcb9d64b1ebbdfc5480d"}
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.143110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-94xfz" event={"ID":"7415e725-513c-4557-8d04-62a9c452ec6c","Type":"ContainerStarted","Data":"668811f6b8180049c62cfc0a38ee50f9b0f64eddff2213b3eadcfd3e764e1f19"}
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.143142 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-94xfz" event={"ID":"7415e725-513c-4557-8d04-62a9c452ec6c","Type":"ContainerStarted","Data":"8d5e1eea54582175d8449d019e7da5e2c2a94aebd2c8fb064b872715795995bc"}
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.144399 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-94xfz"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.455501 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-94xfz" podStartSLOduration=2.455477042 podStartE2EDuration="2.455477042s" podCreationTimestamp="2025-12-09 04:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:52:28.171500004 +0000 UTC m=+6029.880805430" watchObservedRunningTime="2025-12-09 04:52:28.455477042 +0000 UTC m=+6030.164782468"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.461553 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4kszh"]
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.463155 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.465803 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.474149 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4kszh"]
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.606372 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41fc5d08-4777-4c4a-b7f4-3efb34963689-config\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.606460 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/41fc5d08-4777-4c4a-b7f4-3efb34963689-ovs-rundir\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.606765 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/41fc5d08-4777-4c4a-b7f4-3efb34963689-ovn-rundir\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.606847 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrgc\" (UniqueName: \"kubernetes.io/projected/41fc5d08-4777-4c4a-b7f4-3efb34963689-kube-api-access-thrgc\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.708504 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41fc5d08-4777-4c4a-b7f4-3efb34963689-config\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.708840 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/41fc5d08-4777-4c4a-b7f4-3efb34963689-ovs-rundir\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.708909 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/41fc5d08-4777-4c4a-b7f4-3efb34963689-ovn-rundir\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.708936 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrgc\" (UniqueName: \"kubernetes.io/projected/41fc5d08-4777-4c4a-b7f4-3efb34963689-kube-api-access-thrgc\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.709191 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/41fc5d08-4777-4c4a-b7f4-3efb34963689-ovs-rundir\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.709207 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/41fc5d08-4777-4c4a-b7f4-3efb34963689-ovn-rundir\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.709384 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41fc5d08-4777-4c4a-b7f4-3efb34963689-config\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.728629 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrgc\" (UniqueName: \"kubernetes.io/projected/41fc5d08-4777-4c4a-b7f4-3efb34963689-kube-api-access-thrgc\") pod \"ovn-controller-metrics-4kszh\" (UID: \"41fc5d08-4777-4c4a-b7f4-3efb34963689\") " pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.787958 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4kszh"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.969638 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-2ckkq"]
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.971062 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:28 crc kubenswrapper[4766]: I1209 04:52:28.983561 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-2ckkq"]
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.123243 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4fc\" (UniqueName: \"kubernetes.io/projected/155b15f7-da34-4348-a75d-07da640ab348-kube-api-access-jb4fc\") pod \"octavia-db-create-2ckkq\" (UID: \"155b15f7-da34-4348-a75d-07da640ab348\") " pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.123424 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b15f7-da34-4348-a75d-07da640ab348-operator-scripts\") pod \"octavia-db-create-2ckkq\" (UID: \"155b15f7-da34-4348-a75d-07da640ab348\") " pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.154122 4766 generic.go:334] "Generic (PLEG): container finished" podID="0572d99b-c46f-4734-bf6f-72c01b8a8e84" containerID="8b59602076155ad38546fd7b792671e036931fcdc1a3fd7d8c779e01e4029ce6" exitCode=0
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.154251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zx2fg" event={"ID":"0572d99b-c46f-4734-bf6f-72c01b8a8e84","Type":"ContainerDied","Data":"8b59602076155ad38546fd7b792671e036931fcdc1a3fd7d8c779e01e4029ce6"}
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.224778 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b15f7-da34-4348-a75d-07da640ab348-operator-scripts\") pod \"octavia-db-create-2ckkq\" (UID: \"155b15f7-da34-4348-a75d-07da640ab348\") " pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.224895 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4fc\" (UniqueName: \"kubernetes.io/projected/155b15f7-da34-4348-a75d-07da640ab348-kube-api-access-jb4fc\") pod \"octavia-db-create-2ckkq\" (UID: \"155b15f7-da34-4348-a75d-07da640ab348\") " pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.225894 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b15f7-da34-4348-a75d-07da640ab348-operator-scripts\") pod \"octavia-db-create-2ckkq\" (UID: \"155b15f7-da34-4348-a75d-07da640ab348\") " pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.247937 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4fc\" (UniqueName: \"kubernetes.io/projected/155b15f7-da34-4348-a75d-07da640ab348-kube-api-access-jb4fc\") pod \"octavia-db-create-2ckkq\" (UID: \"155b15f7-da34-4348-a75d-07da640ab348\") " pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:29 crc kubenswrapper[4766]: W1209 04:52:29.275634 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41fc5d08_4777_4c4a_b7f4_3efb34963689.slice/crio-3d3bb622d8df1b9df133f5ff8c146548d3efd3f61d339dc4951e3f601aedfb95 WatchSource:0}: Error finding container 3d3bb622d8df1b9df133f5ff8c146548d3efd3f61d339dc4951e3f601aedfb95: Status 404 returned error can't find the container with id 3d3bb622d8df1b9df133f5ff8c146548d3efd3f61d339dc4951e3f601aedfb95
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.277128 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4kszh"]
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.299993 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:29 crc kubenswrapper[4766]: I1209 04:52:29.775367 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-2ckkq"]
Dec 09 04:52:29 crc kubenswrapper[4766]: W1209 04:52:29.780641 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod155b15f7_da34_4348_a75d_07da640ab348.slice/crio-8054dadb68b1619862c6c36eb4ac742dc9285f88ca7615725ea9b09512eabe5f WatchSource:0}: Error finding container 8054dadb68b1619862c6c36eb4ac742dc9285f88ca7615725ea9b09512eabe5f: Status 404 returned error can't find the container with id 8054dadb68b1619862c6c36eb4ac742dc9285f88ca7615725ea9b09512eabe5f
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.165440 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zx2fg" event={"ID":"0572d99b-c46f-4734-bf6f-72c01b8a8e84","Type":"ContainerStarted","Data":"911cb41827e5534a3ea1613e44c3836ab1c119fa990125359e998f34e7cd2795"}
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.165767 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.165785 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zx2fg" event={"ID":"0572d99b-c46f-4734-bf6f-72c01b8a8e84","Type":"ContainerStarted","Data":"a25daa0f29ef40eeab11fedd22de1103fd3002df07e6dc95c1f42c0557068e5d"}
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.165800 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zx2fg"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.168577 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4kszh" event={"ID":"41fc5d08-4777-4c4a-b7f4-3efb34963689","Type":"ContainerStarted","Data":"96b678b5e14d31afd492ead948fbec4835ac81615ba4d7d6341da3b855f03124"}
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.168631 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4kszh" event={"ID":"41fc5d08-4777-4c4a-b7f4-3efb34963689","Type":"ContainerStarted","Data":"3d3bb622d8df1b9df133f5ff8c146548d3efd3f61d339dc4951e3f601aedfb95"}
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.170278 4766 generic.go:334] "Generic (PLEG): container finished" podID="155b15f7-da34-4348-a75d-07da640ab348" containerID="bed691b430cf9b58e5f3ee65bb04ec9d4318a833b6ed67595404c0aa4283a3e7" exitCode=0
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.170346 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-2ckkq" event={"ID":"155b15f7-da34-4348-a75d-07da640ab348","Type":"ContainerDied","Data":"bed691b430cf9b58e5f3ee65bb04ec9d4318a833b6ed67595404c0aa4283a3e7"}
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.170390 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-2ckkq" event={"ID":"155b15f7-da34-4348-a75d-07da640ab348","Type":"ContainerStarted","Data":"8054dadb68b1619862c6c36eb4ac742dc9285f88ca7615725ea9b09512eabe5f"}
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.190661 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zx2fg" podStartSLOduration=4.190644472 podStartE2EDuration="4.190644472s" podCreationTimestamp="2025-12-09 04:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:52:30.185506462 +0000 UTC m=+6031.894811898" watchObservedRunningTime="2025-12-09 04:52:30.190644472 +0000 UTC m=+6031.899949898"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.225446 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4kszh" podStartSLOduration=2.225427363 podStartE2EDuration="2.225427363s" podCreationTimestamp="2025-12-09 04:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:52:30.215185605 +0000 UTC m=+6031.924491031" watchObservedRunningTime="2025-12-09 04:52:30.225427363 +0000 UTC m=+6031.934732789"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.277936 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-ee4a-account-create-update-j9clj"]
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.281119 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-ee4a-account-create-update-j9clj"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.288366 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.297773 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-ee4a-account-create-update-j9clj"]
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.353467 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hj2v\" (UniqueName: \"kubernetes.io/projected/28d94708-9c8a-42d8-aa7c-4698518e046c-kube-api-access-5hj2v\") pod \"octavia-ee4a-account-create-update-j9clj\" (UID: \"28d94708-9c8a-42d8-aa7c-4698518e046c\") " pod="openstack/octavia-ee4a-account-create-update-j9clj"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.353837 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28d94708-9c8a-42d8-aa7c-4698518e046c-operator-scripts\") pod \"octavia-ee4a-account-create-update-j9clj\" (UID: \"28d94708-9c8a-42d8-aa7c-4698518e046c\") " pod="openstack/octavia-ee4a-account-create-update-j9clj"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.456050 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hj2v\" (UniqueName: \"kubernetes.io/projected/28d94708-9c8a-42d8-aa7c-4698518e046c-kube-api-access-5hj2v\") pod \"octavia-ee4a-account-create-update-j9clj\" (UID: \"28d94708-9c8a-42d8-aa7c-4698518e046c\") " pod="openstack/octavia-ee4a-account-create-update-j9clj"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.456190 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28d94708-9c8a-42d8-aa7c-4698518e046c-operator-scripts\") pod \"octavia-ee4a-account-create-update-j9clj\" (UID: \"28d94708-9c8a-42d8-aa7c-4698518e046c\") " pod="openstack/octavia-ee4a-account-create-update-j9clj"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.457194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28d94708-9c8a-42d8-aa7c-4698518e046c-operator-scripts\") pod \"octavia-ee4a-account-create-update-j9clj\" (UID: \"28d94708-9c8a-42d8-aa7c-4698518e046c\") " pod="openstack/octavia-ee4a-account-create-update-j9clj"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.478903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hj2v\" (UniqueName: \"kubernetes.io/projected/28d94708-9c8a-42d8-aa7c-4698518e046c-kube-api-access-5hj2v\") pod \"octavia-ee4a-account-create-update-j9clj\" (UID: \"28d94708-9c8a-42d8-aa7c-4698518e046c\") " pod="openstack/octavia-ee4a-account-create-update-j9clj"
Dec 09 04:52:30 crc kubenswrapper[4766]: I1209 04:52:30.615378 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-ee4a-account-create-update-j9clj"
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.046978 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jnf67"]
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.059297 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jnf67"]
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.107700 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-ee4a-account-create-update-j9clj"]
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.183236 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ee4a-account-create-update-j9clj" event={"ID":"28d94708-9c8a-42d8-aa7c-4698518e046c","Type":"ContainerStarted","Data":"5d92f53cb259b9ebfc9ea2bcdd93c19e292058bfe603af224b9b5170b8f59cc9"}
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.587960 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-2ckkq"
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.678793 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb4fc\" (UniqueName: \"kubernetes.io/projected/155b15f7-da34-4348-a75d-07da640ab348-kube-api-access-jb4fc\") pod \"155b15f7-da34-4348-a75d-07da640ab348\" (UID: \"155b15f7-da34-4348-a75d-07da640ab348\") "
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.679274 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b15f7-da34-4348-a75d-07da640ab348-operator-scripts\") pod \"155b15f7-da34-4348-a75d-07da640ab348\" (UID: \"155b15f7-da34-4348-a75d-07da640ab348\") "
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.679910 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/155b15f7-da34-4348-a75d-07da640ab348-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "155b15f7-da34-4348-a75d-07da640ab348" (UID: "155b15f7-da34-4348-a75d-07da640ab348"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.684861 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/155b15f7-da34-4348-a75d-07da640ab348-kube-api-access-jb4fc" (OuterVolumeSpecName: "kube-api-access-jb4fc") pod "155b15f7-da34-4348-a75d-07da640ab348" (UID: "155b15f7-da34-4348-a75d-07da640ab348"). InnerVolumeSpecName "kube-api-access-jb4fc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.781419 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb4fc\" (UniqueName: \"kubernetes.io/projected/155b15f7-da34-4348-a75d-07da640ab348-kube-api-access-jb4fc\") on node \"crc\" DevicePath \"\""
Dec 09 04:52:31 crc kubenswrapper[4766]: I1209 04:52:31.781454 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/155b15f7-da34-4348-a75d-07da640ab348-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 09 04:52:32 crc kubenswrapper[4766]: I1209 04:52:32.195726 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-2ckkq" event={"ID":"155b15f7-da34-4348-a75d-07da640ab348","Type":"ContainerDied","Data":"8054dadb68b1619862c6c36eb4ac742dc9285f88ca7615725ea9b09512eabe5f"}
Dec 09 04:52:32 crc kubenswrapper[4766]: I1209 04:52:32.196110 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8054dadb68b1619862c6c36eb4ac742dc9285f88ca7615725ea9b09512eabe5f"
Dec 09 04:52:32 crc kubenswrapper[4766]: I1209 04:52:32.195778 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-2ckkq" Dec 09 04:52:32 crc kubenswrapper[4766]: I1209 04:52:32.198272 4766 generic.go:334] "Generic (PLEG): container finished" podID="28d94708-9c8a-42d8-aa7c-4698518e046c" containerID="2414309a61531152f145b49721ab1afd5ad97f46d0d1e0f1cfbe9bfa8271a488" exitCode=0 Dec 09 04:52:32 crc kubenswrapper[4766]: I1209 04:52:32.198320 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ee4a-account-create-update-j9clj" event={"ID":"28d94708-9c8a-42d8-aa7c-4698518e046c","Type":"ContainerDied","Data":"2414309a61531152f145b49721ab1afd5ad97f46d0d1e0f1cfbe9bfa8271a488"} Dec 09 04:52:32 crc kubenswrapper[4766]: I1209 04:52:32.863375 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68479e12-0d4b-48da-95d0-df49e38051e9" path="/var/lib/kubelet/pods/68479e12-0d4b-48da-95d0-df49e38051e9/volumes" Dec 09 04:52:33 crc kubenswrapper[4766]: I1209 04:52:33.664632 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-ee4a-account-create-update-j9clj" Dec 09 04:52:33 crc kubenswrapper[4766]: I1209 04:52:33.836981 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28d94708-9c8a-42d8-aa7c-4698518e046c-operator-scripts\") pod \"28d94708-9c8a-42d8-aa7c-4698518e046c\" (UID: \"28d94708-9c8a-42d8-aa7c-4698518e046c\") " Dec 09 04:52:33 crc kubenswrapper[4766]: I1209 04:52:33.837319 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hj2v\" (UniqueName: \"kubernetes.io/projected/28d94708-9c8a-42d8-aa7c-4698518e046c-kube-api-access-5hj2v\") pod \"28d94708-9c8a-42d8-aa7c-4698518e046c\" (UID: \"28d94708-9c8a-42d8-aa7c-4698518e046c\") " Dec 09 04:52:33 crc kubenswrapper[4766]: I1209 04:52:33.838163 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28d94708-9c8a-42d8-aa7c-4698518e046c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28d94708-9c8a-42d8-aa7c-4698518e046c" (UID: "28d94708-9c8a-42d8-aa7c-4698518e046c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:52:33 crc kubenswrapper[4766]: I1209 04:52:33.847534 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d94708-9c8a-42d8-aa7c-4698518e046c-kube-api-access-5hj2v" (OuterVolumeSpecName: "kube-api-access-5hj2v") pod "28d94708-9c8a-42d8-aa7c-4698518e046c" (UID: "28d94708-9c8a-42d8-aa7c-4698518e046c"). InnerVolumeSpecName "kube-api-access-5hj2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:52:33 crc kubenswrapper[4766]: I1209 04:52:33.940395 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28d94708-9c8a-42d8-aa7c-4698518e046c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:52:33 crc kubenswrapper[4766]: I1209 04:52:33.940442 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hj2v\" (UniqueName: \"kubernetes.io/projected/28d94708-9c8a-42d8-aa7c-4698518e046c-kube-api-access-5hj2v\") on node \"crc\" DevicePath \"\"" Dec 09 04:52:34 crc kubenswrapper[4766]: I1209 04:52:34.235251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ee4a-account-create-update-j9clj" event={"ID":"28d94708-9c8a-42d8-aa7c-4698518e046c","Type":"ContainerDied","Data":"5d92f53cb259b9ebfc9ea2bcdd93c19e292058bfe603af224b9b5170b8f59cc9"} Dec 09 04:52:34 crc kubenswrapper[4766]: I1209 04:52:34.235581 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d92f53cb259b9ebfc9ea2bcdd93c19e292058bfe603af224b9b5170b8f59cc9" Dec 09 04:52:34 crc kubenswrapper[4766]: I1209 04:52:34.235815 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-ee4a-account-create-update-j9clj" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.506961 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-grfpq"] Dec 09 04:52:35 crc kubenswrapper[4766]: E1209 04:52:35.507553 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155b15f7-da34-4348-a75d-07da640ab348" containerName="mariadb-database-create" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.507569 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="155b15f7-da34-4348-a75d-07da640ab348" containerName="mariadb-database-create" Dec 09 04:52:35 crc kubenswrapper[4766]: E1209 04:52:35.507590 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d94708-9c8a-42d8-aa7c-4698518e046c" containerName="mariadb-account-create-update" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.507598 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d94708-9c8a-42d8-aa7c-4698518e046c" containerName="mariadb-account-create-update" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.507840 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d94708-9c8a-42d8-aa7c-4698518e046c" containerName="mariadb-account-create-update" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.507876 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="155b15f7-da34-4348-a75d-07da640ab348" containerName="mariadb-database-create" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.508600 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.521405 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-grfpq"] Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.571434 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mv5b\" (UniqueName: \"kubernetes.io/projected/c5621b85-b6d5-48c6-911e-a49541e1cdf5-kube-api-access-7mv5b\") pod \"octavia-persistence-db-create-grfpq\" (UID: \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\") " pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.571523 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5621b85-b6d5-48c6-911e-a49541e1cdf5-operator-scripts\") pod \"octavia-persistence-db-create-grfpq\" (UID: \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\") " pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.673614 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mv5b\" (UniqueName: \"kubernetes.io/projected/c5621b85-b6d5-48c6-911e-a49541e1cdf5-kube-api-access-7mv5b\") pod \"octavia-persistence-db-create-grfpq\" (UID: \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\") " pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.673714 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5621b85-b6d5-48c6-911e-a49541e1cdf5-operator-scripts\") pod \"octavia-persistence-db-create-grfpq\" (UID: \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\") " pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.674459 
4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5621b85-b6d5-48c6-911e-a49541e1cdf5-operator-scripts\") pod \"octavia-persistence-db-create-grfpq\" (UID: \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\") " pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.695415 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mv5b\" (UniqueName: \"kubernetes.io/projected/c5621b85-b6d5-48c6-911e-a49541e1cdf5-kube-api-access-7mv5b\") pod \"octavia-persistence-db-create-grfpq\" (UID: \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\") " pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:35 crc kubenswrapper[4766]: I1209 04:52:35.833241 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:36 crc kubenswrapper[4766]: I1209 04:52:36.421402 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-grfpq"] Dec 09 04:52:36 crc kubenswrapper[4766]: W1209 04:52:36.425601 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5621b85_b6d5_48c6_911e_a49541e1cdf5.slice/crio-a5890cb1ecfd1dcb4e3d58db18028368961e2d19c5c260d2e6129a42e844be7b WatchSource:0}: Error finding container a5890cb1ecfd1dcb4e3d58db18028368961e2d19c5c260d2e6129a42e844be7b: Status 404 returned error can't find the container with id a5890cb1ecfd1dcb4e3d58db18028368961e2d19c5c260d2e6129a42e844be7b Dec 09 04:52:36 crc kubenswrapper[4766]: I1209 04:52:36.978878 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-bca3-account-create-update-rj692"] Dec 09 04:52:36 crc kubenswrapper[4766]: I1209 04:52:36.987147 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:36 crc kubenswrapper[4766]: I1209 04:52:36.988707 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-bca3-account-create-update-rj692"] Dec 09 04:52:36 crc kubenswrapper[4766]: I1209 04:52:36.989738 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.018919 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n45th\" (UniqueName: \"kubernetes.io/projected/43010a7a-c276-4355-9ed5-4280ca859180-kube-api-access-n45th\") pod \"octavia-bca3-account-create-update-rj692\" (UID: \"43010a7a-c276-4355-9ed5-4280ca859180\") " pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.020381 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43010a7a-c276-4355-9ed5-4280ca859180-operator-scripts\") pod \"octavia-bca3-account-create-update-rj692\" (UID: \"43010a7a-c276-4355-9ed5-4280ca859180\") " pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.122638 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43010a7a-c276-4355-9ed5-4280ca859180-operator-scripts\") pod \"octavia-bca3-account-create-update-rj692\" (UID: \"43010a7a-c276-4355-9ed5-4280ca859180\") " pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.122864 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n45th\" (UniqueName: 
\"kubernetes.io/projected/43010a7a-c276-4355-9ed5-4280ca859180-kube-api-access-n45th\") pod \"octavia-bca3-account-create-update-rj692\" (UID: \"43010a7a-c276-4355-9ed5-4280ca859180\") " pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.123442 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43010a7a-c276-4355-9ed5-4280ca859180-operator-scripts\") pod \"octavia-bca3-account-create-update-rj692\" (UID: \"43010a7a-c276-4355-9ed5-4280ca859180\") " pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.141072 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n45th\" (UniqueName: \"kubernetes.io/projected/43010a7a-c276-4355-9ed5-4280ca859180-kube-api-access-n45th\") pod \"octavia-bca3-account-create-update-rj692\" (UID: \"43010a7a-c276-4355-9ed5-4280ca859180\") " pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.266552 4766 generic.go:334] "Generic (PLEG): container finished" podID="c5621b85-b6d5-48c6-911e-a49541e1cdf5" containerID="ffac8514a004196856a6fe0bd742c3c5106d8aeb565454cd48d7702de07fbf57" exitCode=0 Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.266668 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-grfpq" event={"ID":"c5621b85-b6d5-48c6-911e-a49541e1cdf5","Type":"ContainerDied","Data":"ffac8514a004196856a6fe0bd742c3c5106d8aeb565454cd48d7702de07fbf57"} Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.267254 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-grfpq" event={"ID":"c5621b85-b6d5-48c6-911e-a49541e1cdf5","Type":"ContainerStarted","Data":"a5890cb1ecfd1dcb4e3d58db18028368961e2d19c5c260d2e6129a42e844be7b"} Dec 09 04:52:37 crc 
kubenswrapper[4766]: I1209 04:52:37.307284 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:37 crc kubenswrapper[4766]: I1209 04:52:37.843996 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-bca3-account-create-update-rj692"] Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.282509 4766 generic.go:334] "Generic (PLEG): container finished" podID="43010a7a-c276-4355-9ed5-4280ca859180" containerID="8dec284693e16bf24947ddd894429d7d4594dd7dcb76e47b160d5ab5fab39c46" exitCode=0 Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.282553 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-bca3-account-create-update-rj692" event={"ID":"43010a7a-c276-4355-9ed5-4280ca859180","Type":"ContainerDied","Data":"8dec284693e16bf24947ddd894429d7d4594dd7dcb76e47b160d5ab5fab39c46"} Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.283012 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-bca3-account-create-update-rj692" event={"ID":"43010a7a-c276-4355-9ed5-4280ca859180","Type":"ContainerStarted","Data":"e1c3b195bbdf303d88c41a6e2a272b9c40b8f6753e9da787d89f9ab5c48452fe"} Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.720712 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.856882 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5621b85-b6d5-48c6-911e-a49541e1cdf5-operator-scripts\") pod \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\" (UID: \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\") " Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.857797 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mv5b\" (UniqueName: \"kubernetes.io/projected/c5621b85-b6d5-48c6-911e-a49541e1cdf5-kube-api-access-7mv5b\") pod \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\" (UID: \"c5621b85-b6d5-48c6-911e-a49541e1cdf5\") " Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.858090 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5621b85-b6d5-48c6-911e-a49541e1cdf5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5621b85-b6d5-48c6-911e-a49541e1cdf5" (UID: "c5621b85-b6d5-48c6-911e-a49541e1cdf5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.863824 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5621b85-b6d5-48c6-911e-a49541e1cdf5-kube-api-access-7mv5b" (OuterVolumeSpecName: "kube-api-access-7mv5b") pod "c5621b85-b6d5-48c6-911e-a49541e1cdf5" (UID: "c5621b85-b6d5-48c6-911e-a49541e1cdf5"). InnerVolumeSpecName "kube-api-access-7mv5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.960185 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mv5b\" (UniqueName: \"kubernetes.io/projected/c5621b85-b6d5-48c6-911e-a49541e1cdf5-kube-api-access-7mv5b\") on node \"crc\" DevicePath \"\"" Dec 09 04:52:38 crc kubenswrapper[4766]: I1209 04:52:38.960276 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5621b85-b6d5-48c6-911e-a49541e1cdf5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.298706 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-grfpq" event={"ID":"c5621b85-b6d5-48c6-911e-a49541e1cdf5","Type":"ContainerDied","Data":"a5890cb1ecfd1dcb4e3d58db18028368961e2d19c5c260d2e6129a42e844be7b"} Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.298778 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-grfpq" Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.298783 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5890cb1ecfd1dcb4e3d58db18028368961e2d19c5c260d2e6129a42e844be7b" Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.751321 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.878804 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43010a7a-c276-4355-9ed5-4280ca859180-operator-scripts\") pod \"43010a7a-c276-4355-9ed5-4280ca859180\" (UID: \"43010a7a-c276-4355-9ed5-4280ca859180\") " Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.878912 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n45th\" (UniqueName: \"kubernetes.io/projected/43010a7a-c276-4355-9ed5-4280ca859180-kube-api-access-n45th\") pod \"43010a7a-c276-4355-9ed5-4280ca859180\" (UID: \"43010a7a-c276-4355-9ed5-4280ca859180\") " Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.879357 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43010a7a-c276-4355-9ed5-4280ca859180-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43010a7a-c276-4355-9ed5-4280ca859180" (UID: "43010a7a-c276-4355-9ed5-4280ca859180"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.879724 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43010a7a-c276-4355-9ed5-4280ca859180-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.884672 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43010a7a-c276-4355-9ed5-4280ca859180-kube-api-access-n45th" (OuterVolumeSpecName: "kube-api-access-n45th") pod "43010a7a-c276-4355-9ed5-4280ca859180" (UID: "43010a7a-c276-4355-9ed5-4280ca859180"). InnerVolumeSpecName "kube-api-access-n45th". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:52:39 crc kubenswrapper[4766]: I1209 04:52:39.982473 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n45th\" (UniqueName: \"kubernetes.io/projected/43010a7a-c276-4355-9ed5-4280ca859180-kube-api-access-n45th\") on node \"crc\" DevicePath \"\"" Dec 09 04:52:40 crc kubenswrapper[4766]: I1209 04:52:40.315927 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-bca3-account-create-update-rj692" event={"ID":"43010a7a-c276-4355-9ed5-4280ca859180","Type":"ContainerDied","Data":"e1c3b195bbdf303d88c41a6e2a272b9c40b8f6753e9da787d89f9ab5c48452fe"} Dec 09 04:52:40 crc kubenswrapper[4766]: I1209 04:52:40.315978 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1c3b195bbdf303d88c41a6e2a272b9c40b8f6753e9da787d89f9ab5c48452fe" Dec 09 04:52:40 crc kubenswrapper[4766]: I1209 04:52:40.316058 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-bca3-account-create-update-rj692" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.086301 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-544d8574f4-hrpph"] Dec 09 04:52:43 crc kubenswrapper[4766]: E1209 04:52:43.086902 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5621b85-b6d5-48c6-911e-a49541e1cdf5" containerName="mariadb-database-create" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.086914 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5621b85-b6d5-48c6-911e-a49541e1cdf5" containerName="mariadb-database-create" Dec 09 04:52:43 crc kubenswrapper[4766]: E1209 04:52:43.086932 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43010a7a-c276-4355-9ed5-4280ca859180" containerName="mariadb-account-create-update" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.086938 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="43010a7a-c276-4355-9ed5-4280ca859180" containerName="mariadb-account-create-update" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.087114 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5621b85-b6d5-48c6-911e-a49541e1cdf5" containerName="mariadb-database-create" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.087139 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="43010a7a-c276-4355-9ed5-4280ca859180" containerName="mariadb-account-create-update" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.088721 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.096192 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.096494 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.096657 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-5vp48" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.113887 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-544d8574f4-hrpph"] Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.151312 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-config-data\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.151586 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/e8178831-8e45-4026-9a23-d402c776fccd-config-data-merged\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.151722 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-combined-ca-bundle\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.151791 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-scripts\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.152115 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/e8178831-8e45-4026-9a23-d402c776fccd-octavia-run\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.252797 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-config-data\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.252864 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/e8178831-8e45-4026-9a23-d402c776fccd-config-data-merged\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.252906 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-combined-ca-bundle\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.252923 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-scripts\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.252974 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/e8178831-8e45-4026-9a23-d402c776fccd-octavia-run\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.253459 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/e8178831-8e45-4026-9a23-d402c776fccd-octavia-run\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.253519 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e8178831-8e45-4026-9a23-d402c776fccd-config-data-merged\") 
pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.261472 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-combined-ca-bundle\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.261492 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-scripts\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.262353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8178831-8e45-4026-9a23-d402c776fccd-config-data\") pod \"octavia-api-544d8574f4-hrpph\" (UID: \"e8178831-8e45-4026-9a23-d402c776fccd\") " pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.417465 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:43 crc kubenswrapper[4766]: I1209 04:52:43.896959 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-544d8574f4-hrpph"] Dec 09 04:52:44 crc kubenswrapper[4766]: I1209 04:52:44.355316 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-544d8574f4-hrpph" event={"ID":"e8178831-8e45-4026-9a23-d402c776fccd","Type":"ContainerStarted","Data":"7f4244e4ff329ea2dc88ddc4798b84809f44124c1ea3f9a091776c4c92afe181"} Dec 09 04:52:45 crc kubenswrapper[4766]: I1209 04:52:45.032898 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-496cq"] Dec 09 04:52:45 crc kubenswrapper[4766]: I1209 04:52:45.051464 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-496cq"] Dec 09 04:52:46 crc kubenswrapper[4766]: I1209 04:52:46.853941 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403f07a0-cdb0-4cf1-8d04-9555b8e82fb6" path="/var/lib/kubelet/pods/403f07a0-cdb0-4cf1-8d04-9555b8e82fb6/volumes" Dec 09 04:52:53 crc kubenswrapper[4766]: I1209 04:52:53.447642 4766 generic.go:334] "Generic (PLEG): container finished" podID="e8178831-8e45-4026-9a23-d402c776fccd" containerID="d90bf3152b92e461d0929b94c7569cfee33d619eb1f323d2fb91b14b9e86bf46" exitCode=0 Dec 09 04:52:53 crc kubenswrapper[4766]: I1209 04:52:53.448121 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-544d8574f4-hrpph" event={"ID":"e8178831-8e45-4026-9a23-d402c776fccd","Type":"ContainerDied","Data":"d90bf3152b92e461d0929b94c7569cfee33d619eb1f323d2fb91b14b9e86bf46"} Dec 09 04:52:54 crc kubenswrapper[4766]: I1209 04:52:54.458753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-544d8574f4-hrpph" 
event={"ID":"e8178831-8e45-4026-9a23-d402c776fccd","Type":"ContainerStarted","Data":"b754504c7a39a9c7d4bde247dc83b1623c1e6235a87c7d90d06c297f7a2f0b4c"} Dec 09 04:52:54 crc kubenswrapper[4766]: I1209 04:52:54.459090 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-544d8574f4-hrpph" event={"ID":"e8178831-8e45-4026-9a23-d402c776fccd","Type":"ContainerStarted","Data":"22f65ec4d46467b6312427f640f0fbcaada28d160dc7c42a90fe785a37a5ad68"} Dec 09 04:52:54 crc kubenswrapper[4766]: I1209 04:52:54.459123 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:54 crc kubenswrapper[4766]: I1209 04:52:54.486950 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-544d8574f4-hrpph" podStartSLOduration=2.952046381 podStartE2EDuration="11.486928632s" podCreationTimestamp="2025-12-09 04:52:43 +0000 UTC" firstStartedPulling="2025-12-09 04:52:43.900123768 +0000 UTC m=+6045.609429194" lastFinishedPulling="2025-12-09 04:52:52.435006029 +0000 UTC m=+6054.144311445" observedRunningTime="2025-12-09 04:52:54.478407272 +0000 UTC m=+6056.187712718" watchObservedRunningTime="2025-12-09 04:52:54.486928632 +0000 UTC m=+6056.196234058" Dec 09 04:52:55 crc kubenswrapper[4766]: I1209 04:52:55.472901 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.632777 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-wh6j7"] Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.636289 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.638573 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.646754 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.647955 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.658839 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-wh6j7"] Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.837388 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d29a41da-acab-46d3-bc4c-df381d85613c-scripts\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.837532 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d29a41da-acab-46d3-bc4c-df381d85613c-hm-ports\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.837604 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d29a41da-acab-46d3-bc4c-df381d85613c-config-data\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.837633 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d29a41da-acab-46d3-bc4c-df381d85613c-config-data-merged\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.939487 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d29a41da-acab-46d3-bc4c-df381d85613c-scripts\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.939612 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d29a41da-acab-46d3-bc4c-df381d85613c-hm-ports\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.939664 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d29a41da-acab-46d3-bc4c-df381d85613c-config-data\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.939689 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d29a41da-acab-46d3-bc4c-df381d85613c-config-data-merged\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.940178 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/d29a41da-acab-46d3-bc4c-df381d85613c-config-data-merged\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.940799 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d29a41da-acab-46d3-bc4c-df381d85613c-hm-ports\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.949243 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d29a41da-acab-46d3-bc4c-df381d85613c-scripts\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.959885 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d29a41da-acab-46d3-bc4c-df381d85613c-config-data\") pod \"octavia-rsyslog-wh6j7\" (UID: \"d29a41da-acab-46d3-bc4c-df381d85613c\") " pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:56 crc kubenswrapper[4766]: I1209 04:52:56.962544 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.362494 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vctfb"] Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.364497 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.367108 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.376755 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vctfb"] Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.525938 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-wh6j7"] Dec 09 04:52:57 crc kubenswrapper[4766]: W1209 04:52:57.546348 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd29a41da_acab_46d3_bc4c_df381d85613c.slice/crio-6f20c6d8cc23cc3358f2dbb3925cf40fb531e697c0a8c78af2578e345b120e55 WatchSource:0}: Error finding container 6f20c6d8cc23cc3358f2dbb3925cf40fb531e697c0a8c78af2578e345b120e55: Status 404 returned error can't find the container with id 6f20c6d8cc23cc3358f2dbb3925cf40fb531e697c0a8c78af2578e345b120e55 Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.561341 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/19123bff-e00b-4164-aae4-11ea44eb50e9-amphora-image\") pod \"octavia-image-upload-59f8cff499-vctfb\" (UID: \"19123bff-e00b-4164-aae4-11ea44eb50e9\") " pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.561483 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19123bff-e00b-4164-aae4-11ea44eb50e9-httpd-config\") pod \"octavia-image-upload-59f8cff499-vctfb\" (UID: \"19123bff-e00b-4164-aae4-11ea44eb50e9\") " pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:52:57 crc kubenswrapper[4766]: 
I1209 04:52:57.663339 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19123bff-e00b-4164-aae4-11ea44eb50e9-httpd-config\") pod \"octavia-image-upload-59f8cff499-vctfb\" (UID: \"19123bff-e00b-4164-aae4-11ea44eb50e9\") " pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.663440 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/19123bff-e00b-4164-aae4-11ea44eb50e9-amphora-image\") pod \"octavia-image-upload-59f8cff499-vctfb\" (UID: \"19123bff-e00b-4164-aae4-11ea44eb50e9\") " pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.663987 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/19123bff-e00b-4164-aae4-11ea44eb50e9-amphora-image\") pod \"octavia-image-upload-59f8cff499-vctfb\" (UID: \"19123bff-e00b-4164-aae4-11ea44eb50e9\") " pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.670965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19123bff-e00b-4164-aae4-11ea44eb50e9-httpd-config\") pod \"octavia-image-upload-59f8cff499-vctfb\" (UID: \"19123bff-e00b-4164-aae4-11ea44eb50e9\") " pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:52:57 crc kubenswrapper[4766]: I1209 04:52:57.696132 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.197159 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vctfb"] Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.517186 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vctfb" event={"ID":"19123bff-e00b-4164-aae4-11ea44eb50e9","Type":"ContainerStarted","Data":"be9d4db6638e7b3114ba0618e2495314a26faa9538f010a41110d72fc92dd56c"} Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.519768 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wh6j7" event={"ID":"d29a41da-acab-46d3-bc4c-df381d85613c","Type":"ContainerStarted","Data":"6f20c6d8cc23cc3358f2dbb3925cf40fb531e697c0a8c78af2578e345b120e55"} Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.862738 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-bkxsh"] Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.864403 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.870270 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.899263 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-bkxsh"] Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.986050 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-scripts\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.986171 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-config-data\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.986225 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/019c97bc-64bc-436e-b815-72462b2df991-config-data-merged\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:58 crc kubenswrapper[4766]: I1209 04:52:58.986266 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-combined-ca-bundle\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 
04:52:59.089483 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-config-data\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.089611 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/019c97bc-64bc-436e-b815-72462b2df991-config-data-merged\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.090513 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/019c97bc-64bc-436e-b815-72462b2df991-config-data-merged\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.091297 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-combined-ca-bundle\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.091749 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-scripts\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.095799 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-scripts\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.096285 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-combined-ca-bundle\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.096599 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-config-data\") pod \"octavia-db-sync-bkxsh\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.195780 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:52:59 crc kubenswrapper[4766]: I1209 04:52:59.747753 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-bkxsh"] Dec 09 04:53:00 crc kubenswrapper[4766]: I1209 04:53:00.545697 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bkxsh" event={"ID":"019c97bc-64bc-436e-b815-72462b2df991","Type":"ContainerStarted","Data":"c9021a6c93f2455e340e133f416eba108a5ebd4651351b865790ccc7f321afdb"} Dec 09 04:53:00 crc kubenswrapper[4766]: I1209 04:53:00.547703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wh6j7" event={"ID":"d29a41da-acab-46d3-bc4c-df381d85613c","Type":"ContainerStarted","Data":"44ee506aec09ddf30984da015b547ebec2de42f7b2dc76e1777890f6bfdde6ca"} Dec 09 04:53:01 crc kubenswrapper[4766]: I1209 04:53:01.560264 4766 generic.go:334] "Generic (PLEG): container finished" podID="019c97bc-64bc-436e-b815-72462b2df991" containerID="37c36d6df7256898cd91261a5831c4e8d8aca76b323fbf27268333a26d339515" exitCode=0 Dec 09 04:53:01 crc kubenswrapper[4766]: I1209 04:53:01.560397 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bkxsh" event={"ID":"019c97bc-64bc-436e-b815-72462b2df991","Type":"ContainerDied","Data":"37c36d6df7256898cd91261a5831c4e8d8aca76b323fbf27268333a26d339515"} Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.260331 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-94xfz" podUID="7415e725-513c-4557-8d04-62a9c452ec6c" containerName="ovn-controller" probeResult="failure" output=< Dec 09 04:53:02 crc kubenswrapper[4766]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 09 04:53:02 crc kubenswrapper[4766]: > Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.292512 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-ovs-zx2fg" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.301130 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zx2fg" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.409895 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-94xfz-config-6zgs2"] Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.411528 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.415119 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.419763 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-94xfz-config-6zgs2"] Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.463735 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-scripts\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.463824 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-log-ovn\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.463856 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhs8t\" (UniqueName: 
\"kubernetes.io/projected/280769b5-b8c9-4474-87dd-a72e68bdf31a-kube-api-access-xhs8t\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.463894 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run-ovn\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.463989 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-additional-scripts\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.464203 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.566660 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-additional-scripts\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.566749 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.566794 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-scripts\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.566848 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-log-ovn\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.566868 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhs8t\" (UniqueName: \"kubernetes.io/projected/280769b5-b8c9-4474-87dd-a72e68bdf31a-kube-api-access-xhs8t\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.566909 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run-ovn\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.567101 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run-ovn\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.567131 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.567616 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-additional-scripts\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.568300 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-log-ovn\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.569055 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-scripts\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.572272 4766 generic.go:334] "Generic (PLEG): container finished" podID="d29a41da-acab-46d3-bc4c-df381d85613c" 
containerID="44ee506aec09ddf30984da015b547ebec2de42f7b2dc76e1777890f6bfdde6ca" exitCode=0 Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.572324 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wh6j7" event={"ID":"d29a41da-acab-46d3-bc4c-df381d85613c","Type":"ContainerDied","Data":"44ee506aec09ddf30984da015b547ebec2de42f7b2dc76e1777890f6bfdde6ca"} Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.587625 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhs8t\" (UniqueName: \"kubernetes.io/projected/280769b5-b8c9-4474-87dd-a72e68bdf31a-kube-api-access-xhs8t\") pod \"ovn-controller-94xfz-config-6zgs2\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:02 crc kubenswrapper[4766]: I1209 04:53:02.735406 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:03 crc kubenswrapper[4766]: I1209 04:53:03.000271 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:53:03 crc kubenswrapper[4766]: I1209 04:53:03.338525 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-94xfz-config-6zgs2"] Dec 09 04:53:03 crc kubenswrapper[4766]: I1209 04:53:03.586708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-94xfz-config-6zgs2" event={"ID":"280769b5-b8c9-4474-87dd-a72e68bdf31a","Type":"ContainerStarted","Data":"46dfa322de6ce1a60683ab2c16ea4f33118b0e7c812f17170b110ec3cebfb808"} Dec 09 04:53:03 crc kubenswrapper[4766]: I1209 04:53:03.607859 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-544d8574f4-hrpph" Dec 09 04:53:04 crc kubenswrapper[4766]: I1209 04:53:04.601221 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bkxsh" 
event={"ID":"019c97bc-64bc-436e-b815-72462b2df991","Type":"ContainerStarted","Data":"1e11d5613abc857449f471091ec107b112e673f92506fe7fa0ab1c374b298be6"} Dec 09 04:53:04 crc kubenswrapper[4766]: I1209 04:53:04.605369 4766 generic.go:334] "Generic (PLEG): container finished" podID="280769b5-b8c9-4474-87dd-a72e68bdf31a" containerID="c69943f6e8283184ae7adbf4306bffabe68397b01a656d285816eca3b5b91910" exitCode=0 Dec 09 04:53:04 crc kubenswrapper[4766]: I1209 04:53:04.605404 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-94xfz-config-6zgs2" event={"ID":"280769b5-b8c9-4474-87dd-a72e68bdf31a","Type":"ContainerDied","Data":"c69943f6e8283184ae7adbf4306bffabe68397b01a656d285816eca3b5b91910"} Dec 09 04:53:04 crc kubenswrapper[4766]: I1209 04:53:04.627596 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-bkxsh" podStartSLOduration=6.627580295 podStartE2EDuration="6.627580295s" podCreationTimestamp="2025-12-09 04:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:53:04.619854235 +0000 UTC m=+6066.329159651" watchObservedRunningTime="2025-12-09 04:53:04.627580295 +0000 UTC m=+6066.336885721" Dec 09 04:53:07 crc kubenswrapper[4766]: I1209 04:53:07.260869 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-94xfz" Dec 09 04:53:07 crc kubenswrapper[4766]: I1209 04:53:07.661541 4766 generic.go:334] "Generic (PLEG): container finished" podID="019c97bc-64bc-436e-b815-72462b2df991" containerID="1e11d5613abc857449f471091ec107b112e673f92506fe7fa0ab1c374b298be6" exitCode=0 Dec 09 04:53:07 crc kubenswrapper[4766]: I1209 04:53:07.661595 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bkxsh" 
event={"ID":"019c97bc-64bc-436e-b815-72462b2df991","Type":"ContainerDied","Data":"1e11d5613abc857449f471091ec107b112e673f92506fe7fa0ab1c374b298be6"} Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.161434 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.203112 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run-ovn\") pod \"280769b5-b8c9-4474-87dd-a72e68bdf31a\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.203228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-additional-scripts\") pod \"280769b5-b8c9-4474-87dd-a72e68bdf31a\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.203264 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "280769b5-b8c9-4474-87dd-a72e68bdf31a" (UID: "280769b5-b8c9-4474-87dd-a72e68bdf31a"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.203283 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-scripts\") pod \"280769b5-b8c9-4474-87dd-a72e68bdf31a\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.203481 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run\") pod \"280769b5-b8c9-4474-87dd-a72e68bdf31a\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.203571 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhs8t\" (UniqueName: \"kubernetes.io/projected/280769b5-b8c9-4474-87dd-a72e68bdf31a-kube-api-access-xhs8t\") pod \"280769b5-b8c9-4474-87dd-a72e68bdf31a\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.203690 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-log-ovn\") pod \"280769b5-b8c9-4474-87dd-a72e68bdf31a\" (UID: \"280769b5-b8c9-4474-87dd-a72e68bdf31a\") " Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.203947 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run" (OuterVolumeSpecName: "var-run") pod "280769b5-b8c9-4474-87dd-a72e68bdf31a" (UID: "280769b5-b8c9-4474-87dd-a72e68bdf31a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.204082 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "280769b5-b8c9-4474-87dd-a72e68bdf31a" (UID: "280769b5-b8c9-4474-87dd-a72e68bdf31a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.204617 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "280769b5-b8c9-4474-87dd-a72e68bdf31a" (UID: "280769b5-b8c9-4474-87dd-a72e68bdf31a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.204800 4766 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.204819 4766 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.204828 4766 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.204838 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/280769b5-b8c9-4474-87dd-a72e68bdf31a-var-run\") on node \"crc\" DevicePath \"\"" 
Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.205050 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-scripts" (OuterVolumeSpecName: "scripts") pod "280769b5-b8c9-4474-87dd-a72e68bdf31a" (UID: "280769b5-b8c9-4474-87dd-a72e68bdf31a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.234061 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280769b5-b8c9-4474-87dd-a72e68bdf31a-kube-api-access-xhs8t" (OuterVolumeSpecName: "kube-api-access-xhs8t") pod "280769b5-b8c9-4474-87dd-a72e68bdf31a" (UID: "280769b5-b8c9-4474-87dd-a72e68bdf31a"). InnerVolumeSpecName "kube-api-access-xhs8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.306914 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280769b5-b8c9-4474-87dd-a72e68bdf31a-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.306943 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhs8t\" (UniqueName: \"kubernetes.io/projected/280769b5-b8c9-4474-87dd-a72e68bdf31a-kube-api-access-xhs8t\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.677314 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-94xfz-config-6zgs2" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.678415 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-94xfz-config-6zgs2" event={"ID":"280769b5-b8c9-4474-87dd-a72e68bdf31a","Type":"ContainerDied","Data":"46dfa322de6ce1a60683ab2c16ea4f33118b0e7c812f17170b110ec3cebfb808"} Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.678482 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46dfa322de6ce1a60683ab2c16ea4f33118b0e7c812f17170b110ec3cebfb808" Dec 09 04:53:08 crc kubenswrapper[4766]: I1209 04:53:08.980910 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.024195 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-scripts\") pod \"019c97bc-64bc-436e-b815-72462b2df991\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.024520 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-config-data\") pod \"019c97bc-64bc-436e-b815-72462b2df991\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.024605 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/019c97bc-64bc-436e-b815-72462b2df991-config-data-merged\") pod \"019c97bc-64bc-436e-b815-72462b2df991\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.024749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-combined-ca-bundle\") pod \"019c97bc-64bc-436e-b815-72462b2df991\" (UID: \"019c97bc-64bc-436e-b815-72462b2df991\") " Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.029088 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-config-data" (OuterVolumeSpecName: "config-data") pod "019c97bc-64bc-436e-b815-72462b2df991" (UID: "019c97bc-64bc-436e-b815-72462b2df991"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.029313 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-scripts" (OuterVolumeSpecName: "scripts") pod "019c97bc-64bc-436e-b815-72462b2df991" (UID: "019c97bc-64bc-436e-b815-72462b2df991"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.059771 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/019c97bc-64bc-436e-b815-72462b2df991-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "019c97bc-64bc-436e-b815-72462b2df991" (UID: "019c97bc-64bc-436e-b815-72462b2df991"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.075359 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "019c97bc-64bc-436e-b815-72462b2df991" (UID: "019c97bc-64bc-436e-b815-72462b2df991"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.126627 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.126668 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.126684 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019c97bc-64bc-436e-b815-72462b2df991-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.126698 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/019c97bc-64bc-436e-b815-72462b2df991-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.249016 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-94xfz-config-6zgs2"] Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.258835 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-94xfz-config-6zgs2"] Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.687251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vctfb" event={"ID":"19123bff-e00b-4164-aae4-11ea44eb50e9","Type":"ContainerStarted","Data":"ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb"} Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.688904 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wh6j7" 
event={"ID":"d29a41da-acab-46d3-bc4c-df381d85613c","Type":"ContainerStarted","Data":"fdff55e3c99437cd1dbd315b77a7623b92b2b0b23a4409be93ca67d23a9d84f9"} Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.689523 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.692113 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bkxsh" event={"ID":"019c97bc-64bc-436e-b815-72462b2df991","Type":"ContainerDied","Data":"c9021a6c93f2455e340e133f416eba108a5ebd4651351b865790ccc7f321afdb"} Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.692252 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9021a6c93f2455e340e133f416eba108a5ebd4651351b865790ccc7f321afdb" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.692404 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bkxsh" Dec 09 04:53:09 crc kubenswrapper[4766]: I1209 04:53:09.747480 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-wh6j7" podStartSLOduration=2.864642663 podStartE2EDuration="13.74745708s" podCreationTimestamp="2025-12-09 04:52:56 +0000 UTC" firstStartedPulling="2025-12-09 04:52:57.549452317 +0000 UTC m=+6059.258757743" lastFinishedPulling="2025-12-09 04:53:08.432266724 +0000 UTC m=+6070.141572160" observedRunningTime="2025-12-09 04:53:09.726925354 +0000 UTC m=+6071.436230790" watchObservedRunningTime="2025-12-09 04:53:09.74745708 +0000 UTC m=+6071.456762516" Dec 09 04:53:10 crc kubenswrapper[4766]: I1209 04:53:10.859446 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280769b5-b8c9-4474-87dd-a72e68bdf31a" path="/var/lib/kubelet/pods/280769b5-b8c9-4474-87dd-a72e68bdf31a/volumes" Dec 09 04:53:12 crc kubenswrapper[4766]: I1209 04:53:12.740798 4766 generic.go:334] "Generic (PLEG): 
container finished" podID="19123bff-e00b-4164-aae4-11ea44eb50e9" containerID="ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb" exitCode=0 Dec 09 04:53:12 crc kubenswrapper[4766]: I1209 04:53:12.740895 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vctfb" event={"ID":"19123bff-e00b-4164-aae4-11ea44eb50e9","Type":"ContainerDied","Data":"ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb"} Dec 09 04:53:14 crc kubenswrapper[4766]: I1209 04:53:14.767690 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vctfb" event={"ID":"19123bff-e00b-4164-aae4-11ea44eb50e9","Type":"ContainerStarted","Data":"0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed"} Dec 09 04:53:14 crc kubenswrapper[4766]: I1209 04:53:14.806762 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-vctfb" podStartSLOduration=2.18482411 podStartE2EDuration="17.806732528s" podCreationTimestamp="2025-12-09 04:52:57 +0000 UTC" firstStartedPulling="2025-12-09 04:52:58.2031582 +0000 UTC m=+6059.912463626" lastFinishedPulling="2025-12-09 04:53:13.825066588 +0000 UTC m=+6075.534372044" observedRunningTime="2025-12-09 04:53:14.786192872 +0000 UTC m=+6076.495498308" watchObservedRunningTime="2025-12-09 04:53:14.806732528 +0000 UTC m=+6076.516037964" Dec 09 04:53:15 crc kubenswrapper[4766]: I1209 04:53:15.042879 4766 scope.go:117] "RemoveContainer" containerID="87d67a8d055c3618a15471b1f04db4aaf95d84725ebb64534b3d9a95061cf88d" Dec 09 04:53:15 crc kubenswrapper[4766]: I1209 04:53:15.066044 4766 scope.go:117] "RemoveContainer" containerID="1463142f4d27b9e80410c793f1fcd304c72c10e11410ad6b8705635932913395" Dec 09 04:53:15 crc kubenswrapper[4766]: I1209 04:53:15.176612 4766 scope.go:117] "RemoveContainer" containerID="11911ded66752f771d6ad4fa421c3258a74bdbf1afcb8da093e75a37d9112b17" Dec 09 04:53:15 crc 
kubenswrapper[4766]: I1209 04:53:15.201533 4766 scope.go:117] "RemoveContainer" containerID="4727821be4240c408efe814f03824c0895de2af772d2a3249fb54d395992ea1a" Dec 09 04:53:27 crc kubenswrapper[4766]: I1209 04:53:27.004252 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-wh6j7" Dec 09 04:53:33 crc kubenswrapper[4766]: I1209 04:53:33.778668 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vctfb"] Dec 09 04:53:33 crc kubenswrapper[4766]: I1209 04:53:33.779778 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-vctfb" podUID="19123bff-e00b-4164-aae4-11ea44eb50e9" containerName="octavia-amphora-httpd" containerID="cri-o://0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed" gracePeriod=30 Dec 09 04:53:34 crc kubenswrapper[4766]: I1209 04:53:34.444351 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:53:34 crc kubenswrapper[4766]: I1209 04:53:34.642082 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19123bff-e00b-4164-aae4-11ea44eb50e9-httpd-config\") pod \"19123bff-e00b-4164-aae4-11ea44eb50e9\" (UID: \"19123bff-e00b-4164-aae4-11ea44eb50e9\") " Dec 09 04:53:34 crc kubenswrapper[4766]: I1209 04:53:34.642185 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/19123bff-e00b-4164-aae4-11ea44eb50e9-amphora-image\") pod \"19123bff-e00b-4164-aae4-11ea44eb50e9\" (UID: \"19123bff-e00b-4164-aae4-11ea44eb50e9\") " Dec 09 04:53:34 crc kubenswrapper[4766]: I1209 04:53:34.677848 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19123bff-e00b-4164-aae4-11ea44eb50e9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "19123bff-e00b-4164-aae4-11ea44eb50e9" (UID: "19123bff-e00b-4164-aae4-11ea44eb50e9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:53:34 crc kubenswrapper[4766]: I1209 04:53:34.702531 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19123bff-e00b-4164-aae4-11ea44eb50e9-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "19123bff-e00b-4164-aae4-11ea44eb50e9" (UID: "19123bff-e00b-4164-aae4-11ea44eb50e9"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:53:34 crc kubenswrapper[4766]: I1209 04:53:34.749708 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/19123bff-e00b-4164-aae4-11ea44eb50e9-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:34 crc kubenswrapper[4766]: I1209 04:53:34.750081 4766 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/19123bff-e00b-4164-aae4-11ea44eb50e9-amphora-image\") on node \"crc\" DevicePath \"\"" Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.025931 4766 generic.go:334] "Generic (PLEG): container finished" podID="19123bff-e00b-4164-aae4-11ea44eb50e9" containerID="0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed" exitCode=0 Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.025978 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vctfb" event={"ID":"19123bff-e00b-4164-aae4-11ea44eb50e9","Type":"ContainerDied","Data":"0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed"} Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.026021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vctfb" event={"ID":"19123bff-e00b-4164-aae4-11ea44eb50e9","Type":"ContainerDied","Data":"be9d4db6638e7b3114ba0618e2495314a26faa9538f010a41110d72fc92dd56c"} Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.026034 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vctfb" Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.026039 4766 scope.go:117] "RemoveContainer" containerID="0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed" Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.066144 4766 scope.go:117] "RemoveContainer" containerID="ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb" Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.079945 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vctfb"] Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.097300 4766 scope.go:117] "RemoveContainer" containerID="0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed" Dec 09 04:53:35 crc kubenswrapper[4766]: E1209 04:53:35.099630 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed\": container with ID starting with 0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed not found: ID does not exist" containerID="0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed" Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.099827 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed"} err="failed to get container status \"0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed\": rpc error: code = NotFound desc = could not find container \"0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed\": container with ID starting with 0293acb01fe38ec55559d3c85516758f36b0903f1819953425177847b48d7bed not found: ID does not exist" Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.100008 4766 scope.go:117] "RemoveContainer" 
containerID="ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb" Dec 09 04:53:35 crc kubenswrapper[4766]: E1209 04:53:35.100517 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb\": container with ID starting with ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb not found: ID does not exist" containerID="ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb" Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.100686 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb"} err="failed to get container status \"ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb\": rpc error: code = NotFound desc = could not find container \"ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb\": container with ID starting with ebe82f01347ce44b416883d618d78bf1839b1f0bd8799dd2d395ccb93f1910cb not found: ID does not exist" Dec 09 04:53:35 crc kubenswrapper[4766]: I1209 04:53:35.107118 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vctfb"] Dec 09 04:53:36 crc kubenswrapper[4766]: I1209 04:53:36.851472 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19123bff-e00b-4164-aae4-11ea44eb50e9" path="/var/lib/kubelet/pods/19123bff-e00b-4164-aae4-11ea44eb50e9/volumes" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.315967 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.316301 4766 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.616322 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-46tzk"] Dec 09 04:53:37 crc kubenswrapper[4766]: E1209 04:53:37.616934 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19123bff-e00b-4164-aae4-11ea44eb50e9" containerName="init" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.616983 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="19123bff-e00b-4164-aae4-11ea44eb50e9" containerName="init" Dec 09 04:53:37 crc kubenswrapper[4766]: E1209 04:53:37.617014 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19123bff-e00b-4164-aae4-11ea44eb50e9" containerName="octavia-amphora-httpd" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.617029 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="19123bff-e00b-4164-aae4-11ea44eb50e9" containerName="octavia-amphora-httpd" Dec 09 04:53:37 crc kubenswrapper[4766]: E1209 04:53:37.617056 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280769b5-b8c9-4474-87dd-a72e68bdf31a" containerName="ovn-config" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.617070 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="280769b5-b8c9-4474-87dd-a72e68bdf31a" containerName="ovn-config" Dec 09 04:53:37 crc kubenswrapper[4766]: E1209 04:53:37.617090 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019c97bc-64bc-436e-b815-72462b2df991" containerName="octavia-db-sync" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.617103 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="019c97bc-64bc-436e-b815-72462b2df991" containerName="octavia-db-sync" Dec 09 04:53:37 crc kubenswrapper[4766]: E1209 04:53:37.617131 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019c97bc-64bc-436e-b815-72462b2df991" containerName="init" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.617145 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="019c97bc-64bc-436e-b815-72462b2df991" containerName="init" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.617553 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="019c97bc-64bc-436e-b815-72462b2df991" containerName="octavia-db-sync" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.617611 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="280769b5-b8c9-4474-87dd-a72e68bdf31a" containerName="ovn-config" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.617627 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="19123bff-e00b-4164-aae4-11ea44eb50e9" containerName="octavia-amphora-httpd" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.621250 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-46tzk" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.633706 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-46tzk"] Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.640543 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.814738 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/26760c25-6203-40b2-8302-23310f358a65-httpd-config\") pod \"octavia-image-upload-59f8cff499-46tzk\" (UID: \"26760c25-6203-40b2-8302-23310f358a65\") " pod="openstack/octavia-image-upload-59f8cff499-46tzk" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.815575 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/26760c25-6203-40b2-8302-23310f358a65-amphora-image\") pod \"octavia-image-upload-59f8cff499-46tzk\" (UID: \"26760c25-6203-40b2-8302-23310f358a65\") " pod="openstack/octavia-image-upload-59f8cff499-46tzk" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.917175 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/26760c25-6203-40b2-8302-23310f358a65-amphora-image\") pod \"octavia-image-upload-59f8cff499-46tzk\" (UID: \"26760c25-6203-40b2-8302-23310f358a65\") " pod="openstack/octavia-image-upload-59f8cff499-46tzk" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.917318 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/26760c25-6203-40b2-8302-23310f358a65-httpd-config\") pod \"octavia-image-upload-59f8cff499-46tzk\" (UID: 
\"26760c25-6203-40b2-8302-23310f358a65\") " pod="openstack/octavia-image-upload-59f8cff499-46tzk" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.917914 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/26760c25-6203-40b2-8302-23310f358a65-amphora-image\") pod \"octavia-image-upload-59f8cff499-46tzk\" (UID: \"26760c25-6203-40b2-8302-23310f358a65\") " pod="openstack/octavia-image-upload-59f8cff499-46tzk" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.923010 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/26760c25-6203-40b2-8302-23310f358a65-httpd-config\") pod \"octavia-image-upload-59f8cff499-46tzk\" (UID: \"26760c25-6203-40b2-8302-23310f358a65\") " pod="openstack/octavia-image-upload-59f8cff499-46tzk" Dec 09 04:53:37 crc kubenswrapper[4766]: I1209 04:53:37.969720 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-46tzk" Dec 09 04:53:38 crc kubenswrapper[4766]: I1209 04:53:38.495014 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-46tzk"] Dec 09 04:53:38 crc kubenswrapper[4766]: W1209 04:53:38.508006 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26760c25_6203_40b2_8302_23310f358a65.slice/crio-437f56c63150729113dbb309d250f613388d0c8e3b16a2d8c7cfa0afa1991334 WatchSource:0}: Error finding container 437f56c63150729113dbb309d250f613388d0c8e3b16a2d8c7cfa0afa1991334: Status 404 returned error can't find the container with id 437f56c63150729113dbb309d250f613388d0c8e3b16a2d8c7cfa0afa1991334 Dec 09 04:53:39 crc kubenswrapper[4766]: I1209 04:53:39.081888 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-46tzk" 
event={"ID":"26760c25-6203-40b2-8302-23310f358a65","Type":"ContainerStarted","Data":"437f56c63150729113dbb309d250f613388d0c8e3b16a2d8c7cfa0afa1991334"} Dec 09 04:53:40 crc kubenswrapper[4766]: I1209 04:53:40.096581 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-46tzk" event={"ID":"26760c25-6203-40b2-8302-23310f358a65","Type":"ContainerStarted","Data":"35532d07f534f26e0e9ef118ff4d47b7014f73dee5ffa8f9c777e49760d8040c"} Dec 09 04:53:41 crc kubenswrapper[4766]: I1209 04:53:41.107682 4766 generic.go:334] "Generic (PLEG): container finished" podID="26760c25-6203-40b2-8302-23310f358a65" containerID="35532d07f534f26e0e9ef118ff4d47b7014f73dee5ffa8f9c777e49760d8040c" exitCode=0 Dec 09 04:53:41 crc kubenswrapper[4766]: I1209 04:53:41.107845 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-46tzk" event={"ID":"26760c25-6203-40b2-8302-23310f358a65","Type":"ContainerDied","Data":"35532d07f534f26e0e9ef118ff4d47b7014f73dee5ffa8f9c777e49760d8040c"} Dec 09 04:53:42 crc kubenswrapper[4766]: I1209 04:53:42.121982 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-46tzk" event={"ID":"26760c25-6203-40b2-8302-23310f358a65","Type":"ContainerStarted","Data":"7456d68a64a410e760e9130bca986e93ae372d6ae0f54427063186b160707a10"} Dec 09 04:53:42 crc kubenswrapper[4766]: I1209 04:53:42.154721 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-46tzk" podStartSLOduration=2.167041079 podStartE2EDuration="5.154700379s" podCreationTimestamp="2025-12-09 04:53:37 +0000 UTC" firstStartedPulling="2025-12-09 04:53:38.523382747 +0000 UTC m=+6100.232688173" lastFinishedPulling="2025-12-09 04:53:41.511042057 +0000 UTC m=+6103.220347473" observedRunningTime="2025-12-09 04:53:42.14180759 +0000 UTC m=+6103.851113016" watchObservedRunningTime="2025-12-09 04:53:42.154700379 +0000 UTC 
m=+6103.864005825" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.340626 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-w75w9"] Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.344840 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.360044 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.360256 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.360782 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.363378 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-w75w9"] Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.517098 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-scripts\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.517435 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-config-data\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.517475 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/42c16a26-9345-40e9-b970-01eadcb1e1cf-config-data-merged\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.517492 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/42c16a26-9345-40e9-b970-01eadcb1e1cf-hm-ports\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.517631 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-combined-ca-bundle\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.517657 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-amphora-certs\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.619410 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-combined-ca-bundle\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.619481 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"amphora-certs\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-amphora-certs\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.619529 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-scripts\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.619565 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-config-data\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.619608 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/42c16a26-9345-40e9-b970-01eadcb1e1cf-config-data-merged\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.619628 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/42c16a26-9345-40e9-b970-01eadcb1e1cf-hm-ports\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.621136 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/42c16a26-9345-40e9-b970-01eadcb1e1cf-config-data-merged\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.621167 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/42c16a26-9345-40e9-b970-01eadcb1e1cf-hm-ports\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.627319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-scripts\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.627491 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-amphora-certs\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.628419 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-config-data\") pod \"octavia-healthmanager-w75w9\" (UID: \"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.648616 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c16a26-9345-40e9-b970-01eadcb1e1cf-combined-ca-bundle\") pod \"octavia-healthmanager-w75w9\" (UID: 
\"42c16a26-9345-40e9-b970-01eadcb1e1cf\") " pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:00 crc kubenswrapper[4766]: I1209 04:54:00.680521 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:01 crc kubenswrapper[4766]: I1209 04:54:01.228817 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-w75w9"] Dec 09 04:54:01 crc kubenswrapper[4766]: W1209 04:54:01.234864 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c16a26_9345_40e9_b970_01eadcb1e1cf.slice/crio-a9e48e8eaf5aa0f2d95cf96d75b4e1a7df2489581408492a74dba206ac3d9686 WatchSource:0}: Error finding container a9e48e8eaf5aa0f2d95cf96d75b4e1a7df2489581408492a74dba206ac3d9686: Status 404 returned error can't find the container with id a9e48e8eaf5aa0f2d95cf96d75b4e1a7df2489581408492a74dba206ac3d9686 Dec 09 04:54:01 crc kubenswrapper[4766]: I1209 04:54:01.370758 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-w75w9" event={"ID":"42c16a26-9345-40e9-b970-01eadcb1e1cf","Type":"ContainerStarted","Data":"a9e48e8eaf5aa0f2d95cf96d75b4e1a7df2489581408492a74dba206ac3d9686"} Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.207587 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-b69rx"] Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.211032 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.213768 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.213925 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.236783 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-b69rx"] Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.353375 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-combined-ca-bundle\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.353528 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/707b62f5-7629-49cd-9cc3-a2a73d21ea70-config-data-merged\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.353565 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/707b62f5-7629-49cd-9cc3-a2a73d21ea70-hm-ports\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.353907 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-amphora-certs\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.353987 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-scripts\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.354070 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-config-data\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.380258 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-w75w9" event={"ID":"42c16a26-9345-40e9-b970-01eadcb1e1cf","Type":"ContainerStarted","Data":"166c2e52584d0bf8c357b089ca156032a1cf5833fc4d476214fce74a17f9064d"} Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.455829 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/707b62f5-7629-49cd-9cc3-a2a73d21ea70-config-data-merged\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.455909 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/707b62f5-7629-49cd-9cc3-a2a73d21ea70-hm-ports\") pod \"octavia-housekeeping-b69rx\" (UID: 
\"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.456067 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-amphora-certs\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.456112 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-scripts\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.456168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-config-data\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.456264 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-combined-ca-bundle\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.456384 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/707b62f5-7629-49cd-9cc3-a2a73d21ea70-config-data-merged\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 
04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.457457 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/707b62f5-7629-49cd-9cc3-a2a73d21ea70-hm-ports\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.462789 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-config-data\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.462873 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-scripts\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.463426 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-combined-ca-bundle\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.468012 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/707b62f5-7629-49cd-9cc3-a2a73d21ea70-amphora-certs\") pod \"octavia-housekeeping-b69rx\" (UID: \"707b62f5-7629-49cd-9cc3-a2a73d21ea70\") " pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:02 crc kubenswrapper[4766]: I1209 04:54:02.538987 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.229435 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-b69rx"] Dec 09 04:54:03 crc kubenswrapper[4766]: W1209 04:54:03.239301 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707b62f5_7629_49cd_9cc3_a2a73d21ea70.slice/crio-f9da79401351a59ca7c84cd5af8cb5c33e14c7649ab040dcb517a81421e1ea3f WatchSource:0}: Error finding container f9da79401351a59ca7c84cd5af8cb5c33e14c7649ab040dcb517a81421e1ea3f: Status 404 returned error can't find the container with id f9da79401351a59ca7c84cd5af8cb5c33e14c7649ab040dcb517a81421e1ea3f Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.414167 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-b69rx" event={"ID":"707b62f5-7629-49cd-9cc3-a2a73d21ea70","Type":"ContainerStarted","Data":"f9da79401351a59ca7c84cd5af8cb5c33e14c7649ab040dcb517a81421e1ea3f"} Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.599703 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-78vr2"] Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.601760 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.603592 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.603691 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.630595 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-78vr2"] Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.684853 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ae9d7a5e-caed-457f-9a5c-56faf7db6547-hm-ports\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.684927 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ae9d7a5e-caed-457f-9a5c-56faf7db6547-config-data-merged\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.684989 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-combined-ca-bundle\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.685065 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-config-data\") 
pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.685104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-amphora-certs\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.685149 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-scripts\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.786386 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ae9d7a5e-caed-457f-9a5c-56faf7db6547-hm-ports\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.786643 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ae9d7a5e-caed-457f-9a5c-56faf7db6547-config-data-merged\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.786684 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-combined-ca-bundle\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" 
Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.786717 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-config-data\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.786738 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-amphora-certs\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.786767 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-scripts\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.787102 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ae9d7a5e-caed-457f-9a5c-56faf7db6547-config-data-merged\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.788375 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ae9d7a5e-caed-457f-9a5c-56faf7db6547-hm-ports\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.792669 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-config-data\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.795132 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-combined-ca-bundle\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.803787 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-amphora-certs\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.815673 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9d7a5e-caed-457f-9a5c-56faf7db6547-scripts\") pod \"octavia-worker-78vr2\" (UID: \"ae9d7a5e-caed-457f-9a5c-56faf7db6547\") " pod="openstack/octavia-worker-78vr2" Dec 09 04:54:03 crc kubenswrapper[4766]: I1209 04:54:03.946730 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-78vr2" Dec 09 04:54:04 crc kubenswrapper[4766]: I1209 04:54:04.462551 4766 generic.go:334] "Generic (PLEG): container finished" podID="42c16a26-9345-40e9-b970-01eadcb1e1cf" containerID="166c2e52584d0bf8c357b089ca156032a1cf5833fc4d476214fce74a17f9064d" exitCode=0 Dec 09 04:54:04 crc kubenswrapper[4766]: I1209 04:54:04.462737 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-w75w9" event={"ID":"42c16a26-9345-40e9-b970-01eadcb1e1cf","Type":"ContainerDied","Data":"166c2e52584d0bf8c357b089ca156032a1cf5833fc4d476214fce74a17f9064d"} Dec 09 04:54:04 crc kubenswrapper[4766]: I1209 04:54:04.561632 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-78vr2"] Dec 09 04:54:04 crc kubenswrapper[4766]: W1209 04:54:04.581059 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9d7a5e_caed_457f_9a5c_56faf7db6547.slice/crio-e7c58108cd3e923cc35e7af73f1a9c3e50bfcd8d00054160150b4e3cc3d30bdf WatchSource:0}: Error finding container e7c58108cd3e923cc35e7af73f1a9c3e50bfcd8d00054160150b4e3cc3d30bdf: Status 404 returned error can't find the container with id e7c58108cd3e923cc35e7af73f1a9c3e50bfcd8d00054160150b4e3cc3d30bdf Dec 09 04:54:05 crc kubenswrapper[4766]: I1209 04:54:05.480856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-w75w9" event={"ID":"42c16a26-9345-40e9-b970-01eadcb1e1cf","Type":"ContainerStarted","Data":"b64731e69242fabd7a6d988d4cbfbc7aa836110e3809de203c5dc017d51b968e"} Dec 09 04:54:05 crc kubenswrapper[4766]: I1209 04:54:05.482075 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:05 crc kubenswrapper[4766]: I1209 04:54:05.483673 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-78vr2" 
event={"ID":"ae9d7a5e-caed-457f-9a5c-56faf7db6547","Type":"ContainerStarted","Data":"e7c58108cd3e923cc35e7af73f1a9c3e50bfcd8d00054160150b4e3cc3d30bdf"} Dec 09 04:54:05 crc kubenswrapper[4766]: I1209 04:54:05.492125 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-b69rx" event={"ID":"707b62f5-7629-49cd-9cc3-a2a73d21ea70","Type":"ContainerStarted","Data":"2f63a33aeb3fd6fba46ffe187ea50d83b67f8891148d7e82d00721e632bba2c4"} Dec 09 04:54:05 crc kubenswrapper[4766]: I1209 04:54:05.532156 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-w75w9" podStartSLOduration=5.532133075 podStartE2EDuration="5.532133075s" podCreationTimestamp="2025-12-09 04:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:54:05.504883158 +0000 UTC m=+6127.214188584" watchObservedRunningTime="2025-12-09 04:54:05.532133075 +0000 UTC m=+6127.241438501" Dec 09 04:54:06 crc kubenswrapper[4766]: I1209 04:54:06.507617 4766 generic.go:334] "Generic (PLEG): container finished" podID="707b62f5-7629-49cd-9cc3-a2a73d21ea70" containerID="2f63a33aeb3fd6fba46ffe187ea50d83b67f8891148d7e82d00721e632bba2c4" exitCode=0 Dec 09 04:54:06 crc kubenswrapper[4766]: I1209 04:54:06.508161 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-b69rx" event={"ID":"707b62f5-7629-49cd-9cc3-a2a73d21ea70","Type":"ContainerDied","Data":"2f63a33aeb3fd6fba46ffe187ea50d83b67f8891148d7e82d00721e632bba2c4"} Dec 09 04:54:07 crc kubenswrapper[4766]: I1209 04:54:07.316052 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:54:07 crc kubenswrapper[4766]: 
I1209 04:54:07.316610 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:54:07 crc kubenswrapper[4766]: I1209 04:54:07.519955 4766 generic.go:334] "Generic (PLEG): container finished" podID="ae9d7a5e-caed-457f-9a5c-56faf7db6547" containerID="0e448cd406b872d32e77505ef08823282bebdb2b42af13a06157910d84536c0f" exitCode=0 Dec 09 04:54:07 crc kubenswrapper[4766]: I1209 04:54:07.520023 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-78vr2" event={"ID":"ae9d7a5e-caed-457f-9a5c-56faf7db6547","Type":"ContainerDied","Data":"0e448cd406b872d32e77505ef08823282bebdb2b42af13a06157910d84536c0f"} Dec 09 04:54:07 crc kubenswrapper[4766]: I1209 04:54:07.524569 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-b69rx" event={"ID":"707b62f5-7629-49cd-9cc3-a2a73d21ea70","Type":"ContainerStarted","Data":"b2723428f347c8bed79c652ada7fadaf1b441c8b4b75bfcb3c1db6bf1a3c1b1c"} Dec 09 04:54:07 crc kubenswrapper[4766]: I1209 04:54:07.524876 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:07 crc kubenswrapper[4766]: I1209 04:54:07.584454 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-b69rx" podStartSLOduration=4.305244226 podStartE2EDuration="5.584435359s" podCreationTimestamp="2025-12-09 04:54:02 +0000 UTC" firstStartedPulling="2025-12-09 04:54:03.243494692 +0000 UTC m=+6124.952800118" lastFinishedPulling="2025-12-09 04:54:04.522685825 +0000 UTC m=+6126.231991251" observedRunningTime="2025-12-09 04:54:07.570733359 +0000 UTC m=+6129.280038795" watchObservedRunningTime="2025-12-09 04:54:07.584435359 
+0000 UTC m=+6129.293740775" Dec 09 04:54:08 crc kubenswrapper[4766]: I1209 04:54:08.536282 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-78vr2" event={"ID":"ae9d7a5e-caed-457f-9a5c-56faf7db6547","Type":"ContainerStarted","Data":"295e6ffd5dd71d7091b168347fa5a9e92040a451f4363ad65fdacac1a5987b22"} Dec 09 04:54:08 crc kubenswrapper[4766]: I1209 04:54:08.561330 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-78vr2" podStartSLOduration=4.029034693 podStartE2EDuration="5.561296888s" podCreationTimestamp="2025-12-09 04:54:03 +0000 UTC" firstStartedPulling="2025-12-09 04:54:04.584090614 +0000 UTC m=+6126.293396040" lastFinishedPulling="2025-12-09 04:54:06.116352809 +0000 UTC m=+6127.825658235" observedRunningTime="2025-12-09 04:54:08.556393996 +0000 UTC m=+6130.265699492" watchObservedRunningTime="2025-12-09 04:54:08.561296888 +0000 UTC m=+6130.270602354" Dec 09 04:54:09 crc kubenswrapper[4766]: I1209 04:54:09.543689 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-78vr2" Dec 09 04:54:15 crc kubenswrapper[4766]: I1209 04:54:15.719074 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-w75w9" Dec 09 04:54:17 crc kubenswrapper[4766]: I1209 04:54:17.590638 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-b69rx" Dec 09 04:54:18 crc kubenswrapper[4766]: I1209 04:54:18.978599 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-78vr2" Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.316540 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.317109 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.317164 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.318057 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0fb3a256512499ae16934c9ba9ee1a59382e5cd9e65f810fb30d76fbdefbf701"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.318114 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://0fb3a256512499ae16934c9ba9ee1a59382e5cd9e65f810fb30d76fbdefbf701" gracePeriod=600 Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.868084 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="0fb3a256512499ae16934c9ba9ee1a59382e5cd9e65f810fb30d76fbdefbf701" exitCode=0 Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.868700 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"0fb3a256512499ae16934c9ba9ee1a59382e5cd9e65f810fb30d76fbdefbf701"} Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.868738 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b"} Dec 09 04:54:37 crc kubenswrapper[4766]: I1209 04:54:37.868774 4766 scope.go:117] "RemoveContainer" containerID="3c6c66c4477117b0a8739c0f966f90699d07530112c19d625dc1c9270f886f2d" Dec 09 04:54:54 crc kubenswrapper[4766]: I1209 04:54:54.791569 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j6rd9"] Dec 09 04:54:54 crc kubenswrapper[4766]: I1209 04:54:54.794923 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:54 crc kubenswrapper[4766]: I1209 04:54:54.806394 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6rd9"] Dec 09 04:54:54 crc kubenswrapper[4766]: I1209 04:54:54.906088 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-catalog-content\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:54 crc kubenswrapper[4766]: I1209 04:54:54.906628 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98ggh\" (UniqueName: \"kubernetes.io/projected/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-kube-api-access-98ggh\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " 
pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:54 crc kubenswrapper[4766]: I1209 04:54:54.906821 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-utilities\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:55 crc kubenswrapper[4766]: I1209 04:54:55.009034 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98ggh\" (UniqueName: \"kubernetes.io/projected/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-kube-api-access-98ggh\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:55 crc kubenswrapper[4766]: I1209 04:54:55.009108 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-utilities\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:55 crc kubenswrapper[4766]: I1209 04:54:55.009259 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-catalog-content\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:55 crc kubenswrapper[4766]: I1209 04:54:55.009706 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-catalog-content\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " 
pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:55 crc kubenswrapper[4766]: I1209 04:54:55.010014 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-utilities\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:55 crc kubenswrapper[4766]: I1209 04:54:55.032652 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98ggh\" (UniqueName: \"kubernetes.io/projected/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-kube-api-access-98ggh\") pod \"redhat-operators-j6rd9\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:55 crc kubenswrapper[4766]: I1209 04:54:55.140389 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:54:55 crc kubenswrapper[4766]: I1209 04:54:55.639553 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6rd9"] Dec 09 04:54:56 crc kubenswrapper[4766]: I1209 04:54:56.072475 4766 generic.go:334] "Generic (PLEG): container finished" podID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerID="9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db" exitCode=0 Dec 09 04:54:56 crc kubenswrapper[4766]: I1209 04:54:56.072625 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6rd9" event={"ID":"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a","Type":"ContainerDied","Data":"9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db"} Dec 09 04:54:56 crc kubenswrapper[4766]: I1209 04:54:56.073146 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6rd9" 
event={"ID":"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a","Type":"ContainerStarted","Data":"92f8dab57da9d02a50756ea62c1af4432ee3298b0f9c61fc43057eb622a7aeee"} Dec 09 04:54:58 crc kubenswrapper[4766]: I1209 04:54:58.096982 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6rd9" event={"ID":"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a","Type":"ContainerStarted","Data":"053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b"} Dec 09 04:55:00 crc kubenswrapper[4766]: I1209 04:55:00.120878 4766 generic.go:334] "Generic (PLEG): container finished" podID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerID="053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b" exitCode=0 Dec 09 04:55:00 crc kubenswrapper[4766]: I1209 04:55:00.121197 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6rd9" event={"ID":"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a","Type":"ContainerDied","Data":"053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b"} Dec 09 04:55:01 crc kubenswrapper[4766]: I1209 04:55:01.133508 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6rd9" event={"ID":"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a","Type":"ContainerStarted","Data":"ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb"} Dec 09 04:55:01 crc kubenswrapper[4766]: I1209 04:55:01.162339 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j6rd9" podStartSLOduration=2.718411665 podStartE2EDuration="7.162315723s" podCreationTimestamp="2025-12-09 04:54:54 +0000 UTC" firstStartedPulling="2025-12-09 04:54:56.074852342 +0000 UTC m=+6177.784157768" lastFinishedPulling="2025-12-09 04:55:00.5187564 +0000 UTC m=+6182.228061826" observedRunningTime="2025-12-09 04:55:01.152123057 +0000 UTC m=+6182.861428523" watchObservedRunningTime="2025-12-09 04:55:01.162315723 +0000 UTC m=+6182.871621149" 
Dec 09 04:55:05 crc kubenswrapper[4766]: I1209 04:55:05.141470 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:55:05 crc kubenswrapper[4766]: I1209 04:55:05.142008 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:55:06 crc kubenswrapper[4766]: I1209 04:55:06.214231 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6rd9" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="registry-server" probeResult="failure" output=< Dec 09 04:55:06 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 04:55:06 crc kubenswrapper[4766]: > Dec 09 04:55:07 crc kubenswrapper[4766]: I1209 04:55:07.049043 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4tj69"] Dec 09 04:55:07 crc kubenswrapper[4766]: I1209 04:55:07.060629 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6f65-account-create-update-5gg5r"] Dec 09 04:55:07 crc kubenswrapper[4766]: I1209 04:55:07.073841 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4tj69"] Dec 09 04:55:07 crc kubenswrapper[4766]: I1209 04:55:07.085954 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6f65-account-create-update-5gg5r"] Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.433156 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75d7cb49fc-4rvvd"] Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.435956 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.441227 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.442048 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-rzqfr" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.442142 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.443945 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.448798 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75d7cb49fc-4rvvd"] Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.503322 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.503596 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-log" containerID="cri-o://feb162cbf9533d45e263da4954c3f02a63b71ddfaa4d43d8f64549dd37d305e9" gracePeriod=30 Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.503719 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-httpd" containerID="cri-o://2fdebba462743f7669b38b4292df1c429a8c0470918a04c8b76e88b1cff1e4c2" gracePeriod=30 Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.560202 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-69d89bf46f-gtqkg"] Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.563059 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.570242 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d89bf46f-gtqkg"] Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.590147 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-scripts\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.590203 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-horizon-secret-key\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.590380 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-config-data\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.590408 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-logs\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.590434 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hqmvw\" (UniqueName: \"kubernetes.io/projected/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-kube-api-access-hqmvw\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.616851 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.617133 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-log" containerID="cri-o://64e3d278b9e3fbd760a419c198ca16e7fe4c0ea743181c33fcb6e1c33cc44a00" gracePeriod=30 Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.617182 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-httpd" containerID="cri-o://2f369150b7ee568cdfe54ed4396c15d5014c11103a0b1a2645fb7f8910e33faf" gracePeriod=30 Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692066 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-scripts\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692119 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfcmx\" (UniqueName: \"kubernetes.io/projected/dcb98041-c67b-46a7-b16b-74eeebbee361-kube-api-access-jfcmx\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692143 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-horizon-secret-key\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692186 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcb98041-c67b-46a7-b16b-74eeebbee361-horizon-secret-key\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692293 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-scripts\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692334 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb98041-c67b-46a7-b16b-74eeebbee361-logs\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692363 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-config-data\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692422 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-config-data\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692450 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-logs\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692512 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqmvw\" (UniqueName: \"kubernetes.io/projected/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-kube-api-access-hqmvw\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.692956 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-logs\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.693686 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-config-data\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.694034 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-scripts\") pod 
\"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.697893 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-horizon-secret-key\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.707623 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqmvw\" (UniqueName: \"kubernetes.io/projected/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-kube-api-access-hqmvw\") pod \"horizon-75d7cb49fc-4rvvd\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.761700 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.794555 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfcmx\" (UniqueName: \"kubernetes.io/projected/dcb98041-c67b-46a7-b16b-74eeebbee361-kube-api-access-jfcmx\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.794631 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcb98041-c67b-46a7-b16b-74eeebbee361-horizon-secret-key\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.794686 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-scripts\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.794725 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb98041-c67b-46a7-b16b-74eeebbee361-logs\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.794751 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-config-data\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.795196 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb98041-c67b-46a7-b16b-74eeebbee361-logs\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.795520 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-scripts\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.796063 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-config-data\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " 
pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.797488 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcb98041-c67b-46a7-b16b-74eeebbee361-horizon-secret-key\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.812834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfcmx\" (UniqueName: \"kubernetes.io/projected/dcb98041-c67b-46a7-b16b-74eeebbee361-kube-api-access-jfcmx\") pod \"horizon-69d89bf46f-gtqkg\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.858018 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aea6daf-6c69-4bac-aea8-5d946702ceb3" path="/var/lib/kubelet/pods/8aea6daf-6c69-4bac-aea8-5d946702ceb3/volumes" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.859102 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="920faed1-036c-4f9e-9bd0-758f76a7e4d9" path="/var/lib/kubelet/pods/920faed1-036c-4f9e-9bd0-758f76a7e4d9/volumes" Dec 09 04:55:08 crc kubenswrapper[4766]: I1209 04:55:08.883579 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.211326 4766 generic.go:334] "Generic (PLEG): container finished" podID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerID="64e3d278b9e3fbd760a419c198ca16e7fe4c0ea743181c33fcb6e1c33cc44a00" exitCode=143 Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.211369 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91eb833a-69ae-4fee-bf91-984f74e2291f","Type":"ContainerDied","Data":"64e3d278b9e3fbd760a419c198ca16e7fe4c0ea743181c33fcb6e1c33cc44a00"} Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.218470 4766 generic.go:334] "Generic (PLEG): container finished" podID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerID="feb162cbf9533d45e263da4954c3f02a63b71ddfaa4d43d8f64549dd37d305e9" exitCode=143 Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.218524 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8","Type":"ContainerDied","Data":"feb162cbf9533d45e263da4954c3f02a63b71ddfaa4d43d8f64549dd37d305e9"} Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.251433 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75d7cb49fc-4rvvd"] Dec 09 04:55:09 crc kubenswrapper[4766]: W1209 04:55:09.256514 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f41119_8dff_4af1_9e94_b9ae91eb4f9d.slice/crio-b2e758a9a7cc43e0a9957b56f7d45e6b1575b39963b766c0310da4ac093895de WatchSource:0}: Error finding container b2e758a9a7cc43e0a9957b56f7d45e6b1575b39963b766c0310da4ac093895de: Status 404 returned error can't find the container with id b2e758a9a7cc43e0a9957b56f7d45e6b1575b39963b766c0310da4ac093895de Dec 09 04:55:09 crc kubenswrapper[4766]: W1209 04:55:09.368533 4766 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcb98041_c67b_46a7_b16b_74eeebbee361.slice/crio-183a5cfe694e1ad05020a9ad67c4f717773a67441572c56323d359a3712c0bb0 WatchSource:0}: Error finding container 183a5cfe694e1ad05020a9ad67c4f717773a67441572c56323d359a3712c0bb0: Status 404 returned error can't find the container with id 183a5cfe694e1ad05020a9ad67c4f717773a67441572c56323d359a3712c0bb0 Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.369249 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d89bf46f-gtqkg"] Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.651750 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d89bf46f-gtqkg"] Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.698865 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d55579bb5-sbh89"] Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.700963 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.708362 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d55579bb5-sbh89"] Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.816584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxnnd\" (UniqueName: \"kubernetes.io/projected/e188b94d-17b3-4d2a-b03d-ef25ad951471-kube-api-access-pxnnd\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.816629 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-config-data\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.816669 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e188b94d-17b3-4d2a-b03d-ef25ad951471-logs\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.816746 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e188b94d-17b3-4d2a-b03d-ef25ad951471-horizon-secret-key\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.816808 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-scripts\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.918668 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxnnd\" (UniqueName: \"kubernetes.io/projected/e188b94d-17b3-4d2a-b03d-ef25ad951471-kube-api-access-pxnnd\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.918719 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-config-data\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.918770 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e188b94d-17b3-4d2a-b03d-ef25ad951471-logs\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.918865 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e188b94d-17b3-4d2a-b03d-ef25ad951471-horizon-secret-key\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.918938 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-scripts\") pod 
\"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.920077 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-config-data\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.920596 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e188b94d-17b3-4d2a-b03d-ef25ad951471-logs\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.921057 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-scripts\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.933855 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e188b94d-17b3-4d2a-b03d-ef25ad951471-horizon-secret-key\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:09 crc kubenswrapper[4766]: I1209 04:55:09.937066 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxnnd\" (UniqueName: \"kubernetes.io/projected/e188b94d-17b3-4d2a-b03d-ef25ad951471-kube-api-access-pxnnd\") pod \"horizon-6d55579bb5-sbh89\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:10 crc 
kubenswrapper[4766]: I1209 04:55:10.031351 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:10 crc kubenswrapper[4766]: I1209 04:55:10.261335 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d89bf46f-gtqkg" event={"ID":"dcb98041-c67b-46a7-b16b-74eeebbee361","Type":"ContainerStarted","Data":"183a5cfe694e1ad05020a9ad67c4f717773a67441572c56323d359a3712c0bb0"} Dec 09 04:55:10 crc kubenswrapper[4766]: I1209 04:55:10.291368 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75d7cb49fc-4rvvd" event={"ID":"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d","Type":"ContainerStarted","Data":"b2e758a9a7cc43e0a9957b56f7d45e6b1575b39963b766c0310da4ac093895de"} Dec 09 04:55:10 crc kubenswrapper[4766]: I1209 04:55:10.381661 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d55579bb5-sbh89"] Dec 09 04:55:10 crc kubenswrapper[4766]: W1209 04:55:10.395717 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode188b94d_17b3_4d2a_b03d_ef25ad951471.slice/crio-c3c6bec4ffafbb3c686e5a08ad20ad791c0e48620c09d65a91e93f5d5cecd376 WatchSource:0}: Error finding container c3c6bec4ffafbb3c686e5a08ad20ad791c0e48620c09d65a91e93f5d5cecd376: Status 404 returned error can't find the container with id c3c6bec4ffafbb3c686e5a08ad20ad791c0e48620c09d65a91e93f5d5cecd376 Dec 09 04:55:11 crc kubenswrapper[4766]: I1209 04:55:11.303965 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d55579bb5-sbh89" event={"ID":"e188b94d-17b3-4d2a-b03d-ef25ad951471","Type":"ContainerStarted","Data":"c3c6bec4ffafbb3c686e5a08ad20ad791c0e48620c09d65a91e93f5d5cecd376"} Dec 09 04:55:12 crc kubenswrapper[4766]: I1209 04:55:12.318309 4766 generic.go:334] "Generic (PLEG): container finished" podID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" 
containerID="2fdebba462743f7669b38b4292df1c429a8c0470918a04c8b76e88b1cff1e4c2" exitCode=0 Dec 09 04:55:12 crc kubenswrapper[4766]: I1209 04:55:12.318353 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8","Type":"ContainerDied","Data":"2fdebba462743f7669b38b4292df1c429a8c0470918a04c8b76e88b1cff1e4c2"} Dec 09 04:55:12 crc kubenswrapper[4766]: I1209 04:55:12.323950 4766 generic.go:334] "Generic (PLEG): container finished" podID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerID="2f369150b7ee568cdfe54ed4396c15d5014c11103a0b1a2645fb7f8910e33faf" exitCode=0 Dec 09 04:55:12 crc kubenswrapper[4766]: I1209 04:55:12.323977 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91eb833a-69ae-4fee-bf91-984f74e2291f","Type":"ContainerDied","Data":"2f369150b7ee568cdfe54ed4396c15d5014c11103a0b1a2645fb7f8910e33faf"} Dec 09 04:55:13 crc kubenswrapper[4766]: I1209 04:55:13.035386 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6pxf5"] Dec 09 04:55:13 crc kubenswrapper[4766]: I1209 04:55:13.046224 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6pxf5"] Dec 09 04:55:14 crc kubenswrapper[4766]: I1209 04:55:14.864128 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a388e942-d085-4f67-8f6c-c04eedc01a84" path="/var/lib/kubelet/pods/a388e942-d085-4f67-8f6c-c04eedc01a84/volumes" Dec 09 04:55:15 crc kubenswrapper[4766]: I1209 04:55:15.193561 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:55:15 crc kubenswrapper[4766]: I1209 04:55:15.255444 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:55:15 crc kubenswrapper[4766]: I1209 04:55:15.361939 4766 scope.go:117] 
"RemoveContainer" containerID="dcafac24941a5badba1868ee31abe3f0876f1677ee59545d93b1da4708b145ef" Dec 09 04:55:15 crc kubenswrapper[4766]: I1209 04:55:15.435060 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6rd9"] Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.241541 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.42:9292/healthcheck\": dial tcp 10.217.1.42:9292: connect: connection refused" Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.241646 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.42:9292/healthcheck\": dial tcp 10.217.1.42:9292: connect: connection refused" Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.364628 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j6rd9" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="registry-server" containerID="cri-o://ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb" gracePeriod=2 Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.450838 4766 scope.go:117] "RemoveContainer" containerID="24c523662388fabd2a0f22feb8e8fae84c2e10dab3bc8b1591d566ba131e094e" Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.683735 4766 scope.go:117] "RemoveContainer" containerID="85a1468ad69504b403522a61ef2fa122d3a4e2f277511a377d0384e6df158599" Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.881546 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.990640 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-catalog-content\") pod \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.990722 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-utilities\") pod \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.990753 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98ggh\" (UniqueName: \"kubernetes.io/projected/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-kube-api-access-98ggh\") pod \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\" (UID: \"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a\") " Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.991780 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-utilities" (OuterVolumeSpecName: "utilities") pod "d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" (UID: "d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:16 crc kubenswrapper[4766]: I1209 04:55:16.996977 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-kube-api-access-98ggh" (OuterVolumeSpecName: "kube-api-access-98ggh") pod "d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" (UID: "d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a"). InnerVolumeSpecName "kube-api-access-98ggh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.086445 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.092963 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.092991 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98ggh\" (UniqueName: \"kubernetes.io/projected/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-kube-api-access-98ggh\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.139119 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" (UID: "d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.194592 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-scripts\") pod \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.194637 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-config-data\") pod \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.194708 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-ceph\") pod \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.194760 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-combined-ca-bundle\") pod \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.194786 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-httpd-run\") pod \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.194825 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-logs\") pod \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.194868 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw74x\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-kube-api-access-xw74x\") pod \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\" (UID: \"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8\") " Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.195128 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" (UID: "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.195271 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.195285 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.195902 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-logs" (OuterVolumeSpecName: "logs") pod "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" (UID: "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.200761 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-scripts" (OuterVolumeSpecName: "scripts") pod "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" (UID: "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.200995 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-ceph" (OuterVolumeSpecName: "ceph") pod "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" (UID: "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.205433 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-kube-api-access-xw74x" (OuterVolumeSpecName: "kube-api-access-xw74x") pod "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" (UID: "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8"). InnerVolumeSpecName "kube-api-access-xw74x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.224473 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" (UID: "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.292152 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-config-data" (OuterVolumeSpecName: "config-data") pod "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" (UID: "7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.296883 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.296906 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.296916 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.296924 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.296932 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.296945 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw74x\" (UniqueName: 
\"kubernetes.io/projected/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8-kube-api-access-xw74x\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.379148 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d55579bb5-sbh89" event={"ID":"e188b94d-17b3-4d2a-b03d-ef25ad951471","Type":"ContainerStarted","Data":"b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.379513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d55579bb5-sbh89" event={"ID":"e188b94d-17b3-4d2a-b03d-ef25ad951471","Type":"ContainerStarted","Data":"72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.381134 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8","Type":"ContainerDied","Data":"cfdaa7744aa17718c98b2835033513c500b53353387fb5801b21a1a3763e28eb"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.381189 4766 scope.go:117] "RemoveContainer" containerID="2fdebba462743f7669b38b4292df1c429a8c0470918a04c8b76e88b1cff1e4c2" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.381346 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.386498 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75d7cb49fc-4rvvd" event={"ID":"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d","Type":"ContainerStarted","Data":"52d469056cb01204be65e8718d26cea128b97bd3a1227c05c07d9dc7b7882311"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.386542 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75d7cb49fc-4rvvd" event={"ID":"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d","Type":"ContainerStarted","Data":"9bd107bc516b0734d0998795d29c161752cf3343b49d1ea8e7e5e0a090d8e3b3"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.391060 4766 generic.go:334] "Generic (PLEG): container finished" podID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerID="ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb" exitCode=0 Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.391162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6rd9" event={"ID":"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a","Type":"ContainerDied","Data":"ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.391257 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6rd9" event={"ID":"d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a","Type":"ContainerDied","Data":"92f8dab57da9d02a50756ea62c1af4432ee3298b0f9c61fc43057eb622a7aeee"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.391432 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j6rd9" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.396282 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d89bf46f-gtqkg" event={"ID":"dcb98041-c67b-46a7-b16b-74eeebbee361","Type":"ContainerStarted","Data":"d193497fad6ecc979ed3ecb60088f51cc34576887bd67816a886011691eebf10"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.396326 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d89bf46f-gtqkg" event={"ID":"dcb98041-c67b-46a7-b16b-74eeebbee361","Type":"ContainerStarted","Data":"20007cb3bd525f6fb6b4c1b898c82340c508c9f7df88df0cd2e5c6192292d036"} Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.396419 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-69d89bf46f-gtqkg" podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerName="horizon-log" containerID="cri-o://20007cb3bd525f6fb6b4c1b898c82340c508c9f7df88df0cd2e5c6192292d036" gracePeriod=30 Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.396538 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-69d89bf46f-gtqkg" podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerName="horizon" containerID="cri-o://d193497fad6ecc979ed3ecb60088f51cc34576887bd67816a886011691eebf10" gracePeriod=30 Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.412019 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d55579bb5-sbh89" podStartSLOduration=2.126042366 podStartE2EDuration="8.411996967s" podCreationTimestamp="2025-12-09 04:55:09 +0000 UTC" firstStartedPulling="2025-12-09 04:55:10.398199604 +0000 UTC m=+6192.107505030" lastFinishedPulling="2025-12-09 04:55:16.684154185 +0000 UTC m=+6198.393459631" observedRunningTime="2025-12-09 04:55:17.403483767 +0000 UTC m=+6199.112789203" watchObservedRunningTime="2025-12-09 04:55:17.411996967 
+0000 UTC m=+6199.121302393" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.438853 4766 scope.go:117] "RemoveContainer" containerID="feb162cbf9533d45e263da4954c3f02a63b71ddfaa4d43d8f64549dd37d305e9" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.459751 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-69d89bf46f-gtqkg" podStartSLOduration=2.146866758 podStartE2EDuration="9.459729918s" podCreationTimestamp="2025-12-09 04:55:08 +0000 UTC" firstStartedPulling="2025-12-09 04:55:09.370959656 +0000 UTC m=+6191.080265082" lastFinishedPulling="2025-12-09 04:55:16.683822816 +0000 UTC m=+6198.393128242" observedRunningTime="2025-12-09 04:55:17.431651159 +0000 UTC m=+6199.140956595" watchObservedRunningTime="2025-12-09 04:55:17.459729918 +0000 UTC m=+6199.169035344" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.475900 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75d7cb49fc-4rvvd" podStartSLOduration=2.053319427 podStartE2EDuration="9.475878704s" podCreationTimestamp="2025-12-09 04:55:08 +0000 UTC" firstStartedPulling="2025-12-09 04:55:09.261254869 +0000 UTC m=+6190.970560295" lastFinishedPulling="2025-12-09 04:55:16.683814146 +0000 UTC m=+6198.393119572" observedRunningTime="2025-12-09 04:55:17.465515234 +0000 UTC m=+6199.174820660" watchObservedRunningTime="2025-12-09 04:55:17.475878704 +0000 UTC m=+6199.185184130" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.508600 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.531597 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.545789 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 
04:55:17.546238 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="extract-content" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546254 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="extract-content" Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 04:55:17.546283 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="registry-server" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546289 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="registry-server" Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 04:55:17.546297 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-httpd" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546303 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-httpd" Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 04:55:17.546315 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-log" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546321 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-log" Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 04:55:17.546334 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="extract-utilities" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546341 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="extract-utilities" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546550 4766 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-log" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546564 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" containerName="glance-httpd" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546581 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" containerName="registry-server" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.546901 4766 scope.go:117] "RemoveContainer" containerID="ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.547597 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.550157 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.563602 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.582915 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6rd9"] Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.586495 4766 scope.go:117] "RemoveContainer" containerID="053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.605408 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 
crc kubenswrapper[4766]: I1209 04:55:17.605463 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngb9m\" (UniqueName: \"kubernetes.io/projected/5b4085a1-eb3c-453f-a519-9901c33de716-kube-api-access-ngb9m\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.605504 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.605540 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b4085a1-eb3c-453f-a519-9901c33de716-ceph\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.605592 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b4085a1-eb3c-453f-a519-9901c33de716-logs\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.605670 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " 
pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.605715 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b4085a1-eb3c-453f-a519-9901c33de716-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.605833 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j6rd9"] Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.638182 4766 scope.go:117] "RemoveContainer" containerID="9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.688273 4766 scope.go:117] "RemoveContainer" containerID="ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb" Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 04:55:17.688752 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb\": container with ID starting with ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb not found: ID does not exist" containerID="ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.688785 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb"} err="failed to get container status \"ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb\": rpc error: code = NotFound desc = could not find container \"ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb\": container with ID starting with 
ca79e8f46430e9dc1c733fb82eac6eaa7ab2d546dd96df4590a2595c160862fb not found: ID does not exist" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.688805 4766 scope.go:117] "RemoveContainer" containerID="053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b" Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 04:55:17.689326 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b\": container with ID starting with 053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b not found: ID does not exist" containerID="053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.689369 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b"} err="failed to get container status \"053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b\": rpc error: code = NotFound desc = could not find container \"053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b\": container with ID starting with 053655efeb960ec915501d82f024bffa2a5111654a1ad8ccde119e3b056c944b not found: ID does not exist" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.689399 4766 scope.go:117] "RemoveContainer" containerID="9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db" Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 04:55:17.689727 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db\": container with ID starting with 9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db not found: ID does not exist" containerID="9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db" Dec 09 04:55:17 crc 
kubenswrapper[4766]: I1209 04:55:17.689755 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db"} err="failed to get container status \"9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db\": rpc error: code = NotFound desc = could not find container \"9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db\": container with ID starting with 9b3384f2c2ff32c8ab42ae1cd2d14bf5d6719f49f012f7ef68149fb1840e07db not found: ID does not exist" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.707032 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.707110 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngb9m\" (UniqueName: \"kubernetes.io/projected/5b4085a1-eb3c-453f-a519-9901c33de716-kube-api-access-ngb9m\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.707173 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.707238 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b4085a1-eb3c-453f-a519-9901c33de716-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.707300 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b4085a1-eb3c-453f-a519-9901c33de716-logs\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.707384 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.707427 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b4085a1-eb3c-453f-a519-9901c33de716-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.707953 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b4085a1-eb3c-453f-a519-9901c33de716-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.708376 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b4085a1-eb3c-453f-a519-9901c33de716-logs\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " 
pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.712367 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-scripts\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.712828 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.713902 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5b4085a1-eb3c-453f-a519-9901c33de716-ceph\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.714808 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b4085a1-eb3c-453f-a519-9901c33de716-config-data\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.731720 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngb9m\" (UniqueName: \"kubernetes.io/projected/5b4085a1-eb3c-453f-a519-9901c33de716-kube-api-access-ngb9m\") pod \"glance-default-external-api-0\" (UID: \"5b4085a1-eb3c-453f-a519-9901c33de716\") " pod="openstack/glance-default-external-api-0" Dec 09 04:55:17 crc kubenswrapper[4766]: E1209 04:55:17.750071 
4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4771a9c_b401_4e55_89aa_e8c2f5cb6b1a.slice/crio-92f8dab57da9d02a50756ea62c1af4432ee3298b0f9c61fc43057eb622a7aeee\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4771a9c_b401_4e55_89aa_e8c2f5cb6b1a.slice\": RecentStats: unable to find data in memory cache]" Dec 09 04:55:17 crc kubenswrapper[4766]: I1209 04:55:17.916565 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.120067 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.222588 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-config-data\") pod \"91eb833a-69ae-4fee-bf91-984f74e2291f\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.222708 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-combined-ca-bundle\") pod \"91eb833a-69ae-4fee-bf91-984f74e2291f\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.222785 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-scripts\") pod \"91eb833a-69ae-4fee-bf91-984f74e2291f\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.222809 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-logs\") pod \"91eb833a-69ae-4fee-bf91-984f74e2291f\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.222906 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-httpd-run\") pod \"91eb833a-69ae-4fee-bf91-984f74e2291f\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.222952 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-ceph\") pod \"91eb833a-69ae-4fee-bf91-984f74e2291f\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.222977 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njh6m\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-kube-api-access-njh6m\") pod \"91eb833a-69ae-4fee-bf91-984f74e2291f\" (UID: \"91eb833a-69ae-4fee-bf91-984f74e2291f\") " Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.225406 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "91eb833a-69ae-4fee-bf91-984f74e2291f" (UID: "91eb833a-69ae-4fee-bf91-984f74e2291f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.225596 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-logs" (OuterVolumeSpecName: "logs") pod "91eb833a-69ae-4fee-bf91-984f74e2291f" (UID: "91eb833a-69ae-4fee-bf91-984f74e2291f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.231029 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-scripts" (OuterVolumeSpecName: "scripts") pod "91eb833a-69ae-4fee-bf91-984f74e2291f" (UID: "91eb833a-69ae-4fee-bf91-984f74e2291f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.235344 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-ceph" (OuterVolumeSpecName: "ceph") pod "91eb833a-69ae-4fee-bf91-984f74e2291f" (UID: "91eb833a-69ae-4fee-bf91-984f74e2291f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.235377 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-kube-api-access-njh6m" (OuterVolumeSpecName: "kube-api-access-njh6m") pod "91eb833a-69ae-4fee-bf91-984f74e2291f" (UID: "91eb833a-69ae-4fee-bf91-984f74e2291f"). InnerVolumeSpecName "kube-api-access-njh6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.262701 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91eb833a-69ae-4fee-bf91-984f74e2291f" (UID: "91eb833a-69ae-4fee-bf91-984f74e2291f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.299838 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-config-data" (OuterVolumeSpecName: "config-data") pod "91eb833a-69ae-4fee-bf91-984f74e2291f" (UID: "91eb833a-69ae-4fee-bf91-984f74e2291f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.327435 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.327506 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.327534 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/91eb833a-69ae-4fee-bf91-984f74e2291f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.327545 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.327554 
4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njh6m\" (UniqueName: \"kubernetes.io/projected/91eb833a-69ae-4fee-bf91-984f74e2291f-kube-api-access-njh6m\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.327564 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.327572 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91eb833a-69ae-4fee-bf91-984f74e2291f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.410973 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"91eb833a-69ae-4fee-bf91-984f74e2291f","Type":"ContainerDied","Data":"a841dbf0a094ef80ffd5890868432f0ce330df2a45cf26dc5e7215ef39104915"} Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.411050 4766 scope.go:117] "RemoveContainer" containerID="2f369150b7ee568cdfe54ed4396c15d5014c11103a0b1a2645fb7f8910e33faf" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.411424 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.515437 4766 scope.go:117] "RemoveContainer" containerID="64e3d278b9e3fbd760a419c198ca16e7fe4c0ea743181c33fcb6e1c33cc44a00" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.524454 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.545469 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.565279 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:55:18 crc kubenswrapper[4766]: E1209 04:55:18.565802 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-log" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.565818 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-log" Dec 09 04:55:18 crc kubenswrapper[4766]: E1209 04:55:18.565848 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-httpd" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.565856 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-httpd" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.566109 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-log" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.566134 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-httpd" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.567286 4766 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.570478 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.584893 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.619082 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.633564 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.633674 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.633700 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.633737 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.633805 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.633837 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z22c7\" (UniqueName: \"kubernetes.io/projected/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-kube-api-access-z22c7\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.633900 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.735617 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.735727 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.735750 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.735784 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.735843 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.735875 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z22c7\" (UniqueName: \"kubernetes.io/projected/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-kube-api-access-z22c7\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.735936 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-ceph\") pod 
\"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.738683 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.739139 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.740983 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.741081 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.743391 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 
crc kubenswrapper[4766]: I1209 04:55:18.750410 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.757267 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z22c7\" (UniqueName: \"kubernetes.io/projected/e9d2658e-5ee6-4962-81c3-5e7406d32b6f-kube-api-access-z22c7\") pod \"glance-default-internal-api-0\" (UID: \"e9d2658e-5ee6-4962-81c3-5e7406d32b6f\") " pod="openstack/glance-default-internal-api-0" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.762163 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.762304 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.864596 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8" path="/var/lib/kubelet/pods/7a1cc0ec-e95d-4e44-897f-cc1af0d94cc8/volumes" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.865531 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" path="/var/lib/kubelet/pods/91eb833a-69ae-4fee-bf91-984f74e2291f/volumes" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.866285 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a" path="/var/lib/kubelet/pods/d4771a9c-b401-4e55-89aa-e8c2f5cb6b1a/volumes" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.885006 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:18 crc kubenswrapper[4766]: I1209 04:55:18.893686 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:19 crc kubenswrapper[4766]: I1209 04:55:19.420259 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b4085a1-eb3c-453f-a519-9901c33de716","Type":"ContainerStarted","Data":"5eb30c9549b5be4eee4cf179fee2bbf05bf21f45a1e571114230022d729199d0"} Dec 09 04:55:19 crc kubenswrapper[4766]: I1209 04:55:19.540596 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 09 04:55:20 crc kubenswrapper[4766]: I1209 04:55:20.033376 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:20 crc kubenswrapper[4766]: I1209 04:55:20.036683 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:20 crc kubenswrapper[4766]: I1209 04:55:20.440767 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9d2658e-5ee6-4962-81c3-5e7406d32b6f","Type":"ContainerStarted","Data":"53a3b3c8f60c852125370f996495649d4dc458fe736f0d0c743557fac79fcfe7"} Dec 09 04:55:20 crc kubenswrapper[4766]: I1209 04:55:20.441012 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9d2658e-5ee6-4962-81c3-5e7406d32b6f","Type":"ContainerStarted","Data":"99a8677f29baf48d81e0284f0c5e1cb0fcee1ad003aa1c887ac09ce4d64d7b44"} Dec 09 04:55:20 crc kubenswrapper[4766]: I1209 04:55:20.444186 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"5b4085a1-eb3c-453f-a519-9901c33de716","Type":"ContainerStarted","Data":"e520caec9d90a2dea5dfb690f656e273db11b4c57c747452e5a1759de266029a"} Dec 09 04:55:20 crc kubenswrapper[4766]: I1209 04:55:20.444239 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5b4085a1-eb3c-453f-a519-9901c33de716","Type":"ContainerStarted","Data":"d82d304cd53af97674b0f3b1f09278b8d5d6094cf4a0b5a468d52f3aa2cd0701"} Dec 09 04:55:20 crc kubenswrapper[4766]: I1209 04:55:20.470668 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.470646427 podStartE2EDuration="3.470646427s" podCreationTimestamp="2025-12-09 04:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:55:20.464713376 +0000 UTC m=+6202.174018802" watchObservedRunningTime="2025-12-09 04:55:20.470646427 +0000 UTC m=+6202.179951853" Dec 09 04:55:21 crc kubenswrapper[4766]: I1209 04:55:21.457511 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9d2658e-5ee6-4962-81c3-5e7406d32b6f","Type":"ContainerStarted","Data":"11d782d2a03a894735d8bdf99364350691233d2cd1bd4794a0ce9145f1c61682"} Dec 09 04:55:21 crc kubenswrapper[4766]: I1209 04:55:21.489701 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.489661163 podStartE2EDuration="3.489661163s" podCreationTimestamp="2025-12-09 04:55:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:55:21.481201264 +0000 UTC m=+6203.190506700" watchObservedRunningTime="2025-12-09 04:55:21.489661163 +0000 UTC m=+6203.198966609" Dec 09 04:55:27 crc kubenswrapper[4766]: I1209 04:55:27.917089 4766 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 04:55:27 crc kubenswrapper[4766]: I1209 04:55:27.917542 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 09 04:55:27 crc kubenswrapper[4766]: I1209 04:55:27.961821 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 04:55:28 crc kubenswrapper[4766]: I1209 04:55:27.995509 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 09 04:55:28 crc kubenswrapper[4766]: I1209 04:55:28.534823 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 04:55:28 crc kubenswrapper[4766]: I1209 04:55:28.534877 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 09 04:55:28 crc kubenswrapper[4766]: I1209 04:55:28.765967 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75d7cb49fc-4rvvd" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused" Dec 09 04:55:28 crc kubenswrapper[4766]: I1209 04:55:28.894034 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:28 crc kubenswrapper[4766]: I1209 04:55:28.894086 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:28 crc kubenswrapper[4766]: I1209 04:55:28.938694 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:28 crc 
kubenswrapper[4766]: I1209 04:55:28.957055 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:29 crc kubenswrapper[4766]: I1209 04:55:29.549725 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:29 crc kubenswrapper[4766]: I1209 04:55:29.549805 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:30 crc kubenswrapper[4766]: I1209 04:55:30.033391 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d55579bb5-sbh89" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Dec 09 04:55:30 crc kubenswrapper[4766]: I1209 04:55:30.506627 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 04:55:30 crc kubenswrapper[4766]: I1209 04:55:30.551281 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 09 04:55:31 crc kubenswrapper[4766]: I1209 04:55:31.610400 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:31 crc kubenswrapper[4766]: I1209 04:55:31.610929 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 04:55:31 crc kubenswrapper[4766]: I1209 04:55:31.734675 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.482747 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-44hpc"] Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 
04:55:35.485534 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.494592 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44hpc"] Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.552266 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-utilities\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.552738 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-catalog-content\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.553081 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jgjh\" (UniqueName: \"kubernetes.io/projected/65dc7556-90c1-49fa-9305-4e5c2fb52381-kube-api-access-2jgjh\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.655676 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-catalog-content\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc 
kubenswrapper[4766]: I1209 04:55:35.655829 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jgjh\" (UniqueName: \"kubernetes.io/projected/65dc7556-90c1-49fa-9305-4e5c2fb52381-kube-api-access-2jgjh\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.655931 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-utilities\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.656479 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-catalog-content\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.656520 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-utilities\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 04:55:35.679518 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jgjh\" (UniqueName: \"kubernetes.io/projected/65dc7556-90c1-49fa-9305-4e5c2fb52381-kube-api-access-2jgjh\") pod \"certified-operators-44hpc\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:35 crc kubenswrapper[4766]: I1209 
04:55:35.816605 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:36 crc kubenswrapper[4766]: I1209 04:55:36.271822 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44hpc"] Dec 09 04:55:36 crc kubenswrapper[4766]: I1209 04:55:36.627908 4766 generic.go:334] "Generic (PLEG): container finished" podID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerID="a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076" exitCode=0 Dec 09 04:55:36 crc kubenswrapper[4766]: I1209 04:55:36.627949 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hpc" event={"ID":"65dc7556-90c1-49fa-9305-4e5c2fb52381","Type":"ContainerDied","Data":"a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076"} Dec 09 04:55:36 crc kubenswrapper[4766]: I1209 04:55:36.627973 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hpc" event={"ID":"65dc7556-90c1-49fa-9305-4e5c2fb52381","Type":"ContainerStarted","Data":"94c474b91515b3abd52ab64bcff753b6e50ac8ae447163e1a389fbbeb70c99bd"} Dec 09 04:55:36 crc kubenswrapper[4766]: I1209 04:55:36.630018 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 04:55:38 crc kubenswrapper[4766]: I1209 04:55:38.657383 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hpc" event={"ID":"65dc7556-90c1-49fa-9305-4e5c2fb52381","Type":"ContainerStarted","Data":"6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790"} Dec 09 04:55:39 crc kubenswrapper[4766]: I1209 04:55:39.667269 4766 generic.go:334] "Generic (PLEG): container finished" podID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerID="6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790" exitCode=0 Dec 09 04:55:39 crc 
kubenswrapper[4766]: I1209 04:55:39.667320 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hpc" event={"ID":"65dc7556-90c1-49fa-9305-4e5c2fb52381","Type":"ContainerDied","Data":"6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790"} Dec 09 04:55:40 crc kubenswrapper[4766]: I1209 04:55:40.571697 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:40 crc kubenswrapper[4766]: I1209 04:55:40.697331 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hpc" event={"ID":"65dc7556-90c1-49fa-9305-4e5c2fb52381","Type":"ContainerStarted","Data":"cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf"} Dec 09 04:55:40 crc kubenswrapper[4766]: I1209 04:55:40.732158 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-44hpc" podStartSLOduration=2.303783359 podStartE2EDuration="5.732123306s" podCreationTimestamp="2025-12-09 04:55:35 +0000 UTC" firstStartedPulling="2025-12-09 04:55:36.629819884 +0000 UTC m=+6218.339125310" lastFinishedPulling="2025-12-09 04:55:40.058159831 +0000 UTC m=+6221.767465257" observedRunningTime="2025-12-09 04:55:40.727823 +0000 UTC m=+6222.437128436" watchObservedRunningTime="2025-12-09 04:55:40.732123306 +0000 UTC m=+6222.441428732" Dec 09 04:55:41 crc kubenswrapper[4766]: I1209 04:55:41.798252 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:42 crc kubenswrapper[4766]: I1209 04:55:42.062238 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xbw7w"] Dec 09 04:55:42 crc kubenswrapper[4766]: I1209 04:55:42.079279 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xbw7w"] Dec 09 04:55:42 crc kubenswrapper[4766]: I1209 04:55:42.097315 4766 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c01d-account-create-update-jxlnd"] Dec 09 04:55:42 crc kubenswrapper[4766]: I1209 04:55:42.107043 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c01d-account-create-update-jxlnd"] Dec 09 04:55:42 crc kubenswrapper[4766]: I1209 04:55:42.256363 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:55:42 crc kubenswrapper[4766]: I1209 04:55:42.852797 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713b0c31-e7ca-4673-90c1-f90879edd2ec" path="/var/lib/kubelet/pods/713b0c31-e7ca-4673-90c1-f90879edd2ec/volumes" Dec 09 04:55:42 crc kubenswrapper[4766]: I1209 04:55:42.854270 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a4b16b-b72d-4964-9b44-6b276ecf2eeb" path="/var/lib/kubelet/pods/b1a4b16b-b72d-4964-9b44-6b276ecf2eeb/volumes" Dec 09 04:55:43 crc kubenswrapper[4766]: I1209 04:55:43.428698 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:55:43 crc kubenswrapper[4766]: I1209 04:55:43.533482 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75d7cb49fc-4rvvd"] Dec 09 04:55:43 crc kubenswrapper[4766]: I1209 04:55:43.533712 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75d7cb49fc-4rvvd" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon-log" containerID="cri-o://9bd107bc516b0734d0998795d29c161752cf3343b49d1ea8e7e5e0a090d8e3b3" gracePeriod=30 Dec 09 04:55:43 crc kubenswrapper[4766]: I1209 04:55:43.534165 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75d7cb49fc-4rvvd" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon" containerID="cri-o://52d469056cb01204be65e8718d26cea128b97bd3a1227c05c07d9dc7b7882311" 
gracePeriod=30 Dec 09 04:55:45 crc kubenswrapper[4766]: I1209 04:55:45.817083 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:45 crc kubenswrapper[4766]: I1209 04:55:45.817862 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:45 crc kubenswrapper[4766]: I1209 04:55:45.894809 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:46 crc kubenswrapper[4766]: I1209 04:55:46.769487 4766 generic.go:334] "Generic (PLEG): container finished" podID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerID="52d469056cb01204be65e8718d26cea128b97bd3a1227c05c07d9dc7b7882311" exitCode=0 Dec 09 04:55:46 crc kubenswrapper[4766]: I1209 04:55:46.769579 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75d7cb49fc-4rvvd" event={"ID":"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d","Type":"ContainerDied","Data":"52d469056cb01204be65e8718d26cea128b97bd3a1227c05c07d9dc7b7882311"} Dec 09 04:55:46 crc kubenswrapper[4766]: I1209 04:55:46.860814 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:46 crc kubenswrapper[4766]: I1209 04:55:46.926180 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44hpc"] Dec 09 04:55:47 crc kubenswrapper[4766]: I1209 04:55:47.779932 4766 generic.go:334] "Generic (PLEG): container finished" podID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerID="d193497fad6ecc979ed3ecb60088f51cc34576887bd67816a886011691eebf10" exitCode=137 Dec 09 04:55:47 crc kubenswrapper[4766]: I1209 04:55:47.780441 4766 generic.go:334] "Generic (PLEG): container finished" podID="dcb98041-c67b-46a7-b16b-74eeebbee361" 
containerID="20007cb3bd525f6fb6b4c1b898c82340c508c9f7df88df0cd2e5c6192292d036" exitCode=137 Dec 09 04:55:47 crc kubenswrapper[4766]: I1209 04:55:47.779968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d89bf46f-gtqkg" event={"ID":"dcb98041-c67b-46a7-b16b-74eeebbee361","Type":"ContainerDied","Data":"d193497fad6ecc979ed3ecb60088f51cc34576887bd67816a886011691eebf10"} Dec 09 04:55:47 crc kubenswrapper[4766]: I1209 04:55:47.781100 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d89bf46f-gtqkg" event={"ID":"dcb98041-c67b-46a7-b16b-74eeebbee361","Type":"ContainerDied","Data":"20007cb3bd525f6fb6b4c1b898c82340c508c9f7df88df0cd2e5c6192292d036"} Dec 09 04:55:47 crc kubenswrapper[4766]: I1209 04:55:47.781121 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d89bf46f-gtqkg" event={"ID":"dcb98041-c67b-46a7-b16b-74eeebbee361","Type":"ContainerDied","Data":"183a5cfe694e1ad05020a9ad67c4f717773a67441572c56323d359a3712c0bb0"} Dec 09 04:55:47 crc kubenswrapper[4766]: I1209 04:55:47.781133 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="183a5cfe694e1ad05020a9ad67c4f717773a67441572c56323d359a3712c0bb0" Dec 09 04:55:47 crc kubenswrapper[4766]: I1209 04:55:47.851157 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.001068 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.43:9292/healthcheck\": dial tcp 10.217.1.43:9292: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.001100 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="91eb833a-69ae-4fee-bf91-984f74e2291f" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.43:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.043581 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfcmx\" (UniqueName: \"kubernetes.io/projected/dcb98041-c67b-46a7-b16b-74eeebbee361-kube-api-access-jfcmx\") pod \"dcb98041-c67b-46a7-b16b-74eeebbee361\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.043657 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcb98041-c67b-46a7-b16b-74eeebbee361-horizon-secret-key\") pod \"dcb98041-c67b-46a7-b16b-74eeebbee361\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.043797 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb98041-c67b-46a7-b16b-74eeebbee361-logs\") pod \"dcb98041-c67b-46a7-b16b-74eeebbee361\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.043919 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-scripts\") pod \"dcb98041-c67b-46a7-b16b-74eeebbee361\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.044527 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb98041-c67b-46a7-b16b-74eeebbee361-logs" (OuterVolumeSpecName: "logs") pod "dcb98041-c67b-46a7-b16b-74eeebbee361" (UID: "dcb98041-c67b-46a7-b16b-74eeebbee361"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.044907 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-config-data\") pod \"dcb98041-c67b-46a7-b16b-74eeebbee361\" (UID: \"dcb98041-c67b-46a7-b16b-74eeebbee361\") " Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.045711 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb98041-c67b-46a7-b16b-74eeebbee361-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.050469 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb98041-c67b-46a7-b16b-74eeebbee361-kube-api-access-jfcmx" (OuterVolumeSpecName: "kube-api-access-jfcmx") pod "dcb98041-c67b-46a7-b16b-74eeebbee361" (UID: "dcb98041-c67b-46a7-b16b-74eeebbee361"). InnerVolumeSpecName "kube-api-access-jfcmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.053387 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb98041-c67b-46a7-b16b-74eeebbee361-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dcb98041-c67b-46a7-b16b-74eeebbee361" (UID: "dcb98041-c67b-46a7-b16b-74eeebbee361"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.071199 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-config-data" (OuterVolumeSpecName: "config-data") pod "dcb98041-c67b-46a7-b16b-74eeebbee361" (UID: "dcb98041-c67b-46a7-b16b-74eeebbee361"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.090004 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-scripts" (OuterVolumeSpecName: "scripts") pod "dcb98041-c67b-46a7-b16b-74eeebbee361" (UID: "dcb98041-c67b-46a7-b16b-74eeebbee361"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.148782 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.148837 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb98041-c67b-46a7-b16b-74eeebbee361-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.148860 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfcmx\" (UniqueName: \"kubernetes.io/projected/dcb98041-c67b-46a7-b16b-74eeebbee361-kube-api-access-jfcmx\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.148881 4766 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcb98041-c67b-46a7-b16b-74eeebbee361-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.762959 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75d7cb49fc-4rvvd" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.801299 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69d89bf46f-gtqkg" Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.801496 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-44hpc" podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerName="registry-server" containerID="cri-o://cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf" gracePeriod=2 Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.856493 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69d89bf46f-gtqkg"] Dec 09 04:55:48 crc kubenswrapper[4766]: I1209 04:55:48.871759 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-69d89bf46f-gtqkg"] Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.286016 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.478671 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jgjh\" (UniqueName: \"kubernetes.io/projected/65dc7556-90c1-49fa-9305-4e5c2fb52381-kube-api-access-2jgjh\") pod \"65dc7556-90c1-49fa-9305-4e5c2fb52381\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.478841 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-utilities\") pod \"65dc7556-90c1-49fa-9305-4e5c2fb52381\" (UID: \"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.478936 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-catalog-content\") pod \"65dc7556-90c1-49fa-9305-4e5c2fb52381\" (UID: 
\"65dc7556-90c1-49fa-9305-4e5c2fb52381\") " Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.483304 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-utilities" (OuterVolumeSpecName: "utilities") pod "65dc7556-90c1-49fa-9305-4e5c2fb52381" (UID: "65dc7556-90c1-49fa-9305-4e5c2fb52381"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.506161 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65dc7556-90c1-49fa-9305-4e5c2fb52381-kube-api-access-2jgjh" (OuterVolumeSpecName: "kube-api-access-2jgjh") pod "65dc7556-90c1-49fa-9305-4e5c2fb52381" (UID: "65dc7556-90c1-49fa-9305-4e5c2fb52381"). InnerVolumeSpecName "kube-api-access-2jgjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.579360 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65dc7556-90c1-49fa-9305-4e5c2fb52381" (UID: "65dc7556-90c1-49fa-9305-4e5c2fb52381"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.582645 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jgjh\" (UniqueName: \"kubernetes.io/projected/65dc7556-90c1-49fa-9305-4e5c2fb52381-kube-api-access-2jgjh\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.582704 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.582727 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dc7556-90c1-49fa-9305-4e5c2fb52381-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.820494 4766 generic.go:334] "Generic (PLEG): container finished" podID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerID="cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf" exitCode=0 Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.820565 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hpc" event={"ID":"65dc7556-90c1-49fa-9305-4e5c2fb52381","Type":"ContainerDied","Data":"cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf"} Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.820618 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-44hpc" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.820657 4766 scope.go:117] "RemoveContainer" containerID="cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.820632 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hpc" event={"ID":"65dc7556-90c1-49fa-9305-4e5c2fb52381","Type":"ContainerDied","Data":"94c474b91515b3abd52ab64bcff753b6e50ac8ae447163e1a389fbbeb70c99bd"} Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.881640 4766 scope.go:117] "RemoveContainer" containerID="6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.900721 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44hpc"] Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.919447 4766 scope.go:117] "RemoveContainer" containerID="a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.921297 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-44hpc"] Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.968379 4766 scope.go:117] "RemoveContainer" containerID="cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf" Dec 09 04:55:49 crc kubenswrapper[4766]: E1209 04:55:49.968897 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf\": container with ID starting with cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf not found: ID does not exist" containerID="cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.968957 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf"} err="failed to get container status \"cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf\": rpc error: code = NotFound desc = could not find container \"cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf\": container with ID starting with cfdf7e7a0206d9ca973f2b64ad9b2dfde7be8b3298aff4e52f617f29c0b31fcf not found: ID does not exist" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.968990 4766 scope.go:117] "RemoveContainer" containerID="6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790" Dec 09 04:55:49 crc kubenswrapper[4766]: E1209 04:55:49.969393 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790\": container with ID starting with 6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790 not found: ID does not exist" containerID="6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.969424 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790"} err="failed to get container status \"6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790\": rpc error: code = NotFound desc = could not find container \"6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790\": container with ID starting with 6ee5723248ae8cde43f3f501827ca0ae42f3e3970d72ab5fbabc7c177d8e5790 not found: ID does not exist" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.969442 4766 scope.go:117] "RemoveContainer" containerID="a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076" Dec 09 04:55:49 crc kubenswrapper[4766]: E1209 
04:55:49.969694 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076\": container with ID starting with a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076 not found: ID does not exist" containerID="a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076" Dec 09 04:55:49 crc kubenswrapper[4766]: I1209 04:55:49.969735 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076"} err="failed to get container status \"a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076\": rpc error: code = NotFound desc = could not find container \"a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076\": container with ID starting with a6391840d4b70fcecaa17caf4b3acc3422577af6d99264658992d9b38b366076 not found: ID does not exist" Dec 09 04:55:50 crc kubenswrapper[4766]: I1209 04:55:50.038575 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cff6f"] Dec 09 04:55:50 crc kubenswrapper[4766]: I1209 04:55:50.050727 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cff6f"] Dec 09 04:55:50 crc kubenswrapper[4766]: I1209 04:55:50.855432 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" path="/var/lib/kubelet/pods/65dc7556-90c1-49fa-9305-4e5c2fb52381/volumes" Dec 09 04:55:50 crc kubenswrapper[4766]: I1209 04:55:50.856431 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da38c4b9-7fb3-44be-a93b-bbad803dd227" path="/var/lib/kubelet/pods/da38c4b9-7fb3-44be-a93b-bbad803dd227/volumes" Dec 09 04:55:50 crc kubenswrapper[4766]: I1209 04:55:50.857326 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" path="/var/lib/kubelet/pods/dcb98041-c67b-46a7-b16b-74eeebbee361/volumes" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.581601 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b6cc9f797-qt9w7"] Dec 09 04:55:51 crc kubenswrapper[4766]: E1209 04:55:51.582006 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerName="extract-utilities" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.582025 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerName="extract-utilities" Dec 09 04:55:51 crc kubenswrapper[4766]: E1209 04:55:51.582037 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerName="extract-content" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.582044 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerName="extract-content" Dec 09 04:55:51 crc kubenswrapper[4766]: E1209 04:55:51.582063 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerName="horizon-log" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.582069 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerName="horizon-log" Dec 09 04:55:51 crc kubenswrapper[4766]: E1209 04:55:51.582092 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerName="horizon" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.582099 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerName="horizon" Dec 09 04:55:51 crc kubenswrapper[4766]: E1209 04:55:51.582110 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerName="registry-server" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.582116 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerName="registry-server" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.582327 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerName="horizon" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.582344 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="65dc7556-90c1-49fa-9305-4e5c2fb52381" containerName="registry-server" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.582353 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb98041-c67b-46a7-b16b-74eeebbee361" containerName="horizon-log" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.583351 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.598045 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b6cc9f797-qt9w7"] Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.734347 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6f4aeda-1525-486a-83d0-9cb677713681-horizon-secret-key\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.734548 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f4aeda-1525-486a-83d0-9cb677713681-logs\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 
04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.734615 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6f4aeda-1525-486a-83d0-9cb677713681-scripts\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.734651 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrqfs\" (UniqueName: \"kubernetes.io/projected/d6f4aeda-1525-486a-83d0-9cb677713681-kube-api-access-lrqfs\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.734676 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6f4aeda-1525-486a-83d0-9cb677713681-config-data\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.836699 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f4aeda-1525-486a-83d0-9cb677713681-logs\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.836772 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6f4aeda-1525-486a-83d0-9cb677713681-scripts\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.836808 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrqfs\" (UniqueName: \"kubernetes.io/projected/d6f4aeda-1525-486a-83d0-9cb677713681-kube-api-access-lrqfs\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.836829 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6f4aeda-1525-486a-83d0-9cb677713681-config-data\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.836897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6f4aeda-1525-486a-83d0-9cb677713681-horizon-secret-key\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.837242 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f4aeda-1525-486a-83d0-9cb677713681-logs\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.837863 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6f4aeda-1525-486a-83d0-9cb677713681-scripts\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.838269 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d6f4aeda-1525-486a-83d0-9cb677713681-config-data\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.846823 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6f4aeda-1525-486a-83d0-9cb677713681-horizon-secret-key\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.857390 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrqfs\" (UniqueName: \"kubernetes.io/projected/d6f4aeda-1525-486a-83d0-9cb677713681-kube-api-access-lrqfs\") pod \"horizon-7b6cc9f797-qt9w7\" (UID: \"d6f4aeda-1525-486a-83d0-9cb677713681\") " pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:51 crc kubenswrapper[4766]: I1209 04:55:51.899738 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:55:52 crc kubenswrapper[4766]: I1209 04:55:52.462341 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b6cc9f797-qt9w7"] Dec 09 04:55:52 crc kubenswrapper[4766]: I1209 04:55:52.866853 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6cc9f797-qt9w7" event={"ID":"d6f4aeda-1525-486a-83d0-9cb677713681","Type":"ContainerStarted","Data":"39d428daa4418199d8b643964a64648e3b10c0ff5c0da28d02fcb03d04208740"} Dec 09 04:55:52 crc kubenswrapper[4766]: I1209 04:55:52.867248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6cc9f797-qt9w7" event={"ID":"d6f4aeda-1525-486a-83d0-9cb677713681","Type":"ContainerStarted","Data":"1eb440dd762b53c4e8387e9f5d6e2b3cf58ba978e4bc6cf755eddac22bca012a"} Dec 09 04:55:52 crc kubenswrapper[4766]: I1209 04:55:52.867264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b6cc9f797-qt9w7" event={"ID":"d6f4aeda-1525-486a-83d0-9cb677713681","Type":"ContainerStarted","Data":"92d68161ec6457d125761d5d4aabb62f7d1b6c643305a98206de9b450f22a856"} Dec 09 04:55:52 crc kubenswrapper[4766]: I1209 04:55:52.904293 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b6cc9f797-qt9w7" podStartSLOduration=1.904266958 podStartE2EDuration="1.904266958s" podCreationTimestamp="2025-12-09 04:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:55:52.890010162 +0000 UTC m=+6234.599315598" watchObservedRunningTime="2025-12-09 04:55:52.904266958 +0000 UTC m=+6234.613572384" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.176678 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-vn8fq"] Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.178437 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.195670 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-vn8fq"] Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.201369 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trncl\" (UniqueName: \"kubernetes.io/projected/d12df1eb-da3e-4256-8889-4885f8e0286d-kube-api-access-trncl\") pod \"heat-db-create-vn8fq\" (UID: \"d12df1eb-da3e-4256-8889-4885f8e0286d\") " pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.201418 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d12df1eb-da3e-4256-8889-4885f8e0286d-operator-scripts\") pod \"heat-db-create-vn8fq\" (UID: \"d12df1eb-da3e-4256-8889-4885f8e0286d\") " pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.290791 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-2e57-account-create-update-74nhw"] Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.291939 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.294114 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.303378 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trncl\" (UniqueName: \"kubernetes.io/projected/d12df1eb-da3e-4256-8889-4885f8e0286d-kube-api-access-trncl\") pod \"heat-db-create-vn8fq\" (UID: \"d12df1eb-da3e-4256-8889-4885f8e0286d\") " pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.303434 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d12df1eb-da3e-4256-8889-4885f8e0286d-operator-scripts\") pod \"heat-db-create-vn8fq\" (UID: \"d12df1eb-da3e-4256-8889-4885f8e0286d\") " pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.303475 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtckd\" (UniqueName: \"kubernetes.io/projected/57276a31-2e51-48c1-8740-f2e9fcec30f3-kube-api-access-jtckd\") pod \"heat-2e57-account-create-update-74nhw\" (UID: \"57276a31-2e51-48c1-8740-f2e9fcec30f3\") " pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.303510 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57276a31-2e51-48c1-8740-f2e9fcec30f3-operator-scripts\") pod \"heat-2e57-account-create-update-74nhw\" (UID: \"57276a31-2e51-48c1-8740-f2e9fcec30f3\") " pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.304259 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d12df1eb-da3e-4256-8889-4885f8e0286d-operator-scripts\") pod \"heat-db-create-vn8fq\" (UID: \"d12df1eb-da3e-4256-8889-4885f8e0286d\") " pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.316775 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2e57-account-create-update-74nhw"] Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.339852 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trncl\" (UniqueName: \"kubernetes.io/projected/d12df1eb-da3e-4256-8889-4885f8e0286d-kube-api-access-trncl\") pod \"heat-db-create-vn8fq\" (UID: \"d12df1eb-da3e-4256-8889-4885f8e0286d\") " pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.405233 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtckd\" (UniqueName: \"kubernetes.io/projected/57276a31-2e51-48c1-8740-f2e9fcec30f3-kube-api-access-jtckd\") pod \"heat-2e57-account-create-update-74nhw\" (UID: \"57276a31-2e51-48c1-8740-f2e9fcec30f3\") " pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.405293 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57276a31-2e51-48c1-8740-f2e9fcec30f3-operator-scripts\") pod \"heat-2e57-account-create-update-74nhw\" (UID: \"57276a31-2e51-48c1-8740-f2e9fcec30f3\") " pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.406006 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57276a31-2e51-48c1-8740-f2e9fcec30f3-operator-scripts\") pod \"heat-2e57-account-create-update-74nhw\" (UID: \"57276a31-2e51-48c1-8740-f2e9fcec30f3\") " 
pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.421918 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtckd\" (UniqueName: \"kubernetes.io/projected/57276a31-2e51-48c1-8740-f2e9fcec30f3-kube-api-access-jtckd\") pod \"heat-2e57-account-create-update-74nhw\" (UID: \"57276a31-2e51-48c1-8740-f2e9fcec30f3\") " pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.545759 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.611541 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:53 crc kubenswrapper[4766]: I1209 04:55:53.990764 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-vn8fq"] Dec 09 04:55:53 crc kubenswrapper[4766]: W1209 04:55:53.997345 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd12df1eb_da3e_4256_8889_4885f8e0286d.slice/crio-02c97d219a128e6e5b5d77460a095113ed213dac02cb0b63d74bea02ca055e46 WatchSource:0}: Error finding container 02c97d219a128e6e5b5d77460a095113ed213dac02cb0b63d74bea02ca055e46: Status 404 returned error can't find the container with id 02c97d219a128e6e5b5d77460a095113ed213dac02cb0b63d74bea02ca055e46 Dec 09 04:55:54 crc kubenswrapper[4766]: I1209 04:55:54.106558 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2e57-account-create-update-74nhw"] Dec 09 04:55:54 crc kubenswrapper[4766]: W1209 04:55:54.127783 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57276a31_2e51_48c1_8740_f2e9fcec30f3.slice/crio-240c92acb1df63beb3070982616d94041993bfdcf8693f69d1087c0d2224af75 WatchSource:0}: Error finding container 240c92acb1df63beb3070982616d94041993bfdcf8693f69d1087c0d2224af75: Status 404 returned error can't find the container with id 240c92acb1df63beb3070982616d94041993bfdcf8693f69d1087c0d2224af75 Dec 09 04:55:54 crc kubenswrapper[4766]: I1209 04:55:54.895365 4766 generic.go:334] "Generic (PLEG): container finished" podID="57276a31-2e51-48c1-8740-f2e9fcec30f3" containerID="89dccbeaf58b046de7e35de63836973239179f8354f69e6dbf2ec1959dcf6029" exitCode=0 Dec 09 04:55:54 crc kubenswrapper[4766]: I1209 04:55:54.895442 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2e57-account-create-update-74nhw" event={"ID":"57276a31-2e51-48c1-8740-f2e9fcec30f3","Type":"ContainerDied","Data":"89dccbeaf58b046de7e35de63836973239179f8354f69e6dbf2ec1959dcf6029"} Dec 09 04:55:54 crc kubenswrapper[4766]: I1209 04:55:54.895768 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2e57-account-create-update-74nhw" event={"ID":"57276a31-2e51-48c1-8740-f2e9fcec30f3","Type":"ContainerStarted","Data":"240c92acb1df63beb3070982616d94041993bfdcf8693f69d1087c0d2224af75"} Dec 09 04:55:54 crc kubenswrapper[4766]: I1209 04:55:54.898664 4766 generic.go:334] "Generic (PLEG): container finished" podID="d12df1eb-da3e-4256-8889-4885f8e0286d" containerID="cb536fafaeaf8819edd8ffef7694e5f18ac4237ca4383145032004d7ff8a01e2" exitCode=0 Dec 09 04:55:54 crc kubenswrapper[4766]: I1209 04:55:54.898705 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-vn8fq" event={"ID":"d12df1eb-da3e-4256-8889-4885f8e0286d","Type":"ContainerDied","Data":"cb536fafaeaf8819edd8ffef7694e5f18ac4237ca4383145032004d7ff8a01e2"} Dec 09 04:55:54 crc kubenswrapper[4766]: I1209 04:55:54.898730 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-db-create-vn8fq" event={"ID":"d12df1eb-da3e-4256-8889-4885f8e0286d","Type":"ContainerStarted","Data":"02c97d219a128e6e5b5d77460a095113ed213dac02cb0b63d74bea02ca055e46"} Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.423102 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.432840 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.477352 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57276a31-2e51-48c1-8740-f2e9fcec30f3-operator-scripts\") pod \"57276a31-2e51-48c1-8740-f2e9fcec30f3\" (UID: \"57276a31-2e51-48c1-8740-f2e9fcec30f3\") " Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.477421 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d12df1eb-da3e-4256-8889-4885f8e0286d-operator-scripts\") pod \"d12df1eb-da3e-4256-8889-4885f8e0286d\" (UID: \"d12df1eb-da3e-4256-8889-4885f8e0286d\") " Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.477475 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trncl\" (UniqueName: \"kubernetes.io/projected/d12df1eb-da3e-4256-8889-4885f8e0286d-kube-api-access-trncl\") pod \"d12df1eb-da3e-4256-8889-4885f8e0286d\" (UID: \"d12df1eb-da3e-4256-8889-4885f8e0286d\") " Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.477606 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtckd\" (UniqueName: \"kubernetes.io/projected/57276a31-2e51-48c1-8740-f2e9fcec30f3-kube-api-access-jtckd\") pod \"57276a31-2e51-48c1-8740-f2e9fcec30f3\" (UID: 
\"57276a31-2e51-48c1-8740-f2e9fcec30f3\") " Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.478097 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57276a31-2e51-48c1-8740-f2e9fcec30f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57276a31-2e51-48c1-8740-f2e9fcec30f3" (UID: "57276a31-2e51-48c1-8740-f2e9fcec30f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.478762 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d12df1eb-da3e-4256-8889-4885f8e0286d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d12df1eb-da3e-4256-8889-4885f8e0286d" (UID: "d12df1eb-da3e-4256-8889-4885f8e0286d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.488626 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57276a31-2e51-48c1-8740-f2e9fcec30f3-kube-api-access-jtckd" (OuterVolumeSpecName: "kube-api-access-jtckd") pod "57276a31-2e51-48c1-8740-f2e9fcec30f3" (UID: "57276a31-2e51-48c1-8740-f2e9fcec30f3"). InnerVolumeSpecName "kube-api-access-jtckd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.488814 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12df1eb-da3e-4256-8889-4885f8e0286d-kube-api-access-trncl" (OuterVolumeSpecName: "kube-api-access-trncl") pod "d12df1eb-da3e-4256-8889-4885f8e0286d" (UID: "d12df1eb-da3e-4256-8889-4885f8e0286d"). InnerVolumeSpecName "kube-api-access-trncl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.580600 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtckd\" (UniqueName: \"kubernetes.io/projected/57276a31-2e51-48c1-8740-f2e9fcec30f3-kube-api-access-jtckd\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.580642 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57276a31-2e51-48c1-8740-f2e9fcec30f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.580658 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d12df1eb-da3e-4256-8889-4885f8e0286d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.580674 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trncl\" (UniqueName: \"kubernetes.io/projected/d12df1eb-da3e-4256-8889-4885f8e0286d-kube-api-access-trncl\") on node \"crc\" DevicePath \"\"" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.918148 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2e57-account-create-update-74nhw" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.918146 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2e57-account-create-update-74nhw" event={"ID":"57276a31-2e51-48c1-8740-f2e9fcec30f3","Type":"ContainerDied","Data":"240c92acb1df63beb3070982616d94041993bfdcf8693f69d1087c0d2224af75"} Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.918251 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="240c92acb1df63beb3070982616d94041993bfdcf8693f69d1087c0d2224af75" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.919866 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-vn8fq" event={"ID":"d12df1eb-da3e-4256-8889-4885f8e0286d","Type":"ContainerDied","Data":"02c97d219a128e6e5b5d77460a095113ed213dac02cb0b63d74bea02ca055e46"} Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.919910 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02c97d219a128e6e5b5d77460a095113ed213dac02cb0b63d74bea02ca055e46" Dec 09 04:55:56 crc kubenswrapper[4766]: I1209 04:55:56.919929 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-vn8fq" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.408734 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-cjmsh"] Dec 09 04:55:58 crc kubenswrapper[4766]: E1209 04:55:58.409509 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12df1eb-da3e-4256-8889-4885f8e0286d" containerName="mariadb-database-create" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.409529 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12df1eb-da3e-4256-8889-4885f8e0286d" containerName="mariadb-database-create" Dec 09 04:55:58 crc kubenswrapper[4766]: E1209 04:55:58.409566 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57276a31-2e51-48c1-8740-f2e9fcec30f3" containerName="mariadb-account-create-update" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.409575 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="57276a31-2e51-48c1-8740-f2e9fcec30f3" containerName="mariadb-account-create-update" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.409846 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12df1eb-da3e-4256-8889-4885f8e0286d" containerName="mariadb-database-create" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.409881 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="57276a31-2e51-48c1-8740-f2e9fcec30f3" containerName="mariadb-account-create-update" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.410741 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.412985 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-hm4kz" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.413241 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.424246 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cjmsh"] Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.532003 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfw8x\" (UniqueName: \"kubernetes.io/projected/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-kube-api-access-gfw8x\") pod \"heat-db-sync-cjmsh\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.532243 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-config-data\") pod \"heat-db-sync-cjmsh\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.532314 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-combined-ca-bundle\") pod \"heat-db-sync-cjmsh\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.634270 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-config-data\") pod \"heat-db-sync-cjmsh\" (UID: 
\"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.634313 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-combined-ca-bundle\") pod \"heat-db-sync-cjmsh\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.634436 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfw8x\" (UniqueName: \"kubernetes.io/projected/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-kube-api-access-gfw8x\") pod \"heat-db-sync-cjmsh\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.653196 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-combined-ca-bundle\") pod \"heat-db-sync-cjmsh\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.657406 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-config-data\") pod \"heat-db-sync-cjmsh\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.658067 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfw8x\" (UniqueName: \"kubernetes.io/projected/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-kube-api-access-gfw8x\") pod \"heat-db-sync-cjmsh\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.765756 4766 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75d7cb49fc-4rvvd" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused" Dec 09 04:55:58 crc kubenswrapper[4766]: I1209 04:55:58.766125 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cjmsh" Dec 09 04:55:59 crc kubenswrapper[4766]: I1209 04:55:59.258371 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cjmsh"] Dec 09 04:55:59 crc kubenswrapper[4766]: I1209 04:55:59.963123 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cjmsh" event={"ID":"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e","Type":"ContainerStarted","Data":"866eff84a89ea58b86366ab9b0773aa24f736f8dfff1c8b0472ea9e3bcab9626"} Dec 09 04:56:01 crc kubenswrapper[4766]: I1209 04:56:01.900346 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:56:01 crc kubenswrapper[4766]: I1209 04:56:01.903574 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:56:07 crc kubenswrapper[4766]: I1209 04:56:07.055053 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cjmsh" event={"ID":"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e","Type":"ContainerStarted","Data":"43efc8fbaea50861dce629607ecc27ca03fba73e8f625129ff4550aae0861893"} Dec 09 04:56:07 crc kubenswrapper[4766]: I1209 04:56:07.090789 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-cjmsh" podStartSLOduration=1.8104367510000001 podStartE2EDuration="9.090764441s" podCreationTimestamp="2025-12-09 04:55:58 +0000 UTC" firstStartedPulling="2025-12-09 04:55:59.252835962 +0000 UTC m=+6240.962141388" 
lastFinishedPulling="2025-12-09 04:56:06.533163652 +0000 UTC m=+6248.242469078" observedRunningTime="2025-12-09 04:56:07.076793173 +0000 UTC m=+6248.786098609" watchObservedRunningTime="2025-12-09 04:56:07.090764441 +0000 UTC m=+6248.800069867" Dec 09 04:56:08 crc kubenswrapper[4766]: I1209 04:56:08.763391 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75d7cb49fc-4rvvd" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused" Dec 09 04:56:08 crc kubenswrapper[4766]: I1209 04:56:08.763979 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:56:10 crc kubenswrapper[4766]: I1209 04:56:10.090823 4766 generic.go:334] "Generic (PLEG): container finished" podID="66d1d7bd-b62d-4f48-b72b-b9de5ecd611e" containerID="43efc8fbaea50861dce629607ecc27ca03fba73e8f625129ff4550aae0861893" exitCode=0 Dec 09 04:56:10 crc kubenswrapper[4766]: I1209 04:56:10.091644 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cjmsh" event={"ID":"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e","Type":"ContainerDied","Data":"43efc8fbaea50861dce629607ecc27ca03fba73e8f625129ff4550aae0861893"} Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.564870 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-cjmsh" Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.764860 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-combined-ca-bundle\") pod \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.765023 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-config-data\") pod \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.765304 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfw8x\" (UniqueName: \"kubernetes.io/projected/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-kube-api-access-gfw8x\") pod \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\" (UID: \"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e\") " Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.777456 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-kube-api-access-gfw8x" (OuterVolumeSpecName: "kube-api-access-gfw8x") pod "66d1d7bd-b62d-4f48-b72b-b9de5ecd611e" (UID: "66d1d7bd-b62d-4f48-b72b-b9de5ecd611e"). InnerVolumeSpecName "kube-api-access-gfw8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.823427 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66d1d7bd-b62d-4f48-b72b-b9de5ecd611e" (UID: "66d1d7bd-b62d-4f48-b72b-b9de5ecd611e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.869545 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfw8x\" (UniqueName: \"kubernetes.io/projected/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-kube-api-access-gfw8x\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.869578 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:11 crc kubenswrapper[4766]: I1209 04:56:11.977510 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-config-data" (OuterVolumeSpecName: "config-data") pod "66d1d7bd-b62d-4f48-b72b-b9de5ecd611e" (UID: "66d1d7bd-b62d-4f48-b72b-b9de5ecd611e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:56:12 crc kubenswrapper[4766]: I1209 04:56:12.072386 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:12 crc kubenswrapper[4766]: I1209 04:56:12.119276 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cjmsh" event={"ID":"66d1d7bd-b62d-4f48-b72b-b9de5ecd611e","Type":"ContainerDied","Data":"866eff84a89ea58b86366ab9b0773aa24f736f8dfff1c8b0472ea9e3bcab9626"} Dec 09 04:56:12 crc kubenswrapper[4766]: I1209 04:56:12.119328 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="866eff84a89ea58b86366ab9b0773aa24f736f8dfff1c8b0472ea9e3bcab9626" Dec 09 04:56:12 crc kubenswrapper[4766]: I1209 04:56:12.119388 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-cjmsh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.410705 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-674c694cd8-x85ld"] Dec 09 04:56:13 crc kubenswrapper[4766]: E1209 04:56:13.411295 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d1d7bd-b62d-4f48-b72b-b9de5ecd611e" containerName="heat-db-sync" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.411306 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d1d7bd-b62d-4f48-b72b-b9de5ecd611e" containerName="heat-db-sync" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.411488 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d1d7bd-b62d-4f48-b72b-b9de5ecd611e" containerName="heat-db-sync" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.412184 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.419845 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-hm4kz" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.419949 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.420069 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.437071 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-674c694cd8-x85ld"] Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.511135 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-75dd6ffcd6-x4r8x"] Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.516738 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.518781 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.543656 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-75dd6ffcd6-x4r8x"] Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.609666 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-combined-ca-bundle\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.609717 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-config-data\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.609751 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rpvd\" (UniqueName: \"kubernetes.io/projected/4ac6d93b-846d-4e73-a338-172101de6d3c-kube-api-access-2rpvd\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.609788 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-combined-ca-bundle\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " 
pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.609812 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-config-data-custom\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.609856 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-config-data-custom\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.609871 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsrcb\" (UniqueName: \"kubernetes.io/projected/b17e303d-fda5-45e4-a409-56c58f010d9e-kube-api-access-wsrcb\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.609910 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-config-data\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.697246 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-9dcdf99c4-hcdzh"] Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.728028 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.734875 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739123 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-config-data\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739184 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-config-data-custom\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739208 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsrcb\" (UniqueName: \"kubernetes.io/projected/b17e303d-fda5-45e4-a409-56c58f010d9e-kube-api-access-wsrcb\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739251 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-config-data-custom\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739291 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-config-data\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739363 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-combined-ca-bundle\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739392 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-config-data\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739419 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjkgm\" (UniqueName: \"kubernetes.io/projected/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-kube-api-access-zjkgm\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739450 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rpvd\" (UniqueName: \"kubernetes.io/projected/4ac6d93b-846d-4e73-a338-172101de6d3c-kube-api-access-2rpvd\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739469 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-combined-ca-bundle\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739511 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-combined-ca-bundle\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.739542 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-config-data-custom\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.767061 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-combined-ca-bundle\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.767712 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-config-data\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.780908 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rpvd\" (UniqueName: 
\"kubernetes.io/projected/4ac6d93b-846d-4e73-a338-172101de6d3c-kube-api-access-2rpvd\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.794051 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-config-data-custom\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.801180 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsrcb\" (UniqueName: \"kubernetes.io/projected/b17e303d-fda5-45e4-a409-56c58f010d9e-kube-api-access-wsrcb\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.801809 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-config-data\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.807306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac6d93b-846d-4e73-a338-172101de6d3c-combined-ca-bundle\") pod \"heat-engine-674c694cd8-x85ld\" (UID: \"4ac6d93b-846d-4e73-a338-172101de6d3c\") " pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.830623 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b17e303d-fda5-45e4-a409-56c58f010d9e-config-data-custom\") pod \"heat-api-75dd6ffcd6-x4r8x\" (UID: \"b17e303d-fda5-45e4-a409-56c58f010d9e\") " pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.841075 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.843638 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-config-data\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.843692 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-config-data-custom\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.843782 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjkgm\" (UniqueName: \"kubernetes.io/projected/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-kube-api-access-zjkgm\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.843820 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-combined-ca-bundle\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 
04:56:13.873111 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-config-data\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.876991 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-combined-ca-bundle\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.888328 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-9dcdf99c4-hcdzh"] Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.920166 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjkgm\" (UniqueName: \"kubernetes.io/projected/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-kube-api-access-zjkgm\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.920757 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac-config-data-custom\") pod \"heat-cfnapi-9dcdf99c4-hcdzh\" (UID: \"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac\") " pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:13 crc kubenswrapper[4766]: I1209 04:56:13.961971 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.011484 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.053389 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.264701 4766 generic.go:334] "Generic (PLEG): container finished" podID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerID="9bd107bc516b0734d0998795d29c161752cf3343b49d1ea8e7e5e0a090d8e3b3" exitCode=137 Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.265353 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75d7cb49fc-4rvvd" event={"ID":"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d","Type":"ContainerDied","Data":"9bd107bc516b0734d0998795d29c161752cf3343b49d1ea8e7e5e0a090d8e3b3"} Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.320490 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.471617 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-config-data\") pod \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.471756 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-scripts\") pod \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.471845 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqmvw\" (UniqueName: \"kubernetes.io/projected/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-kube-api-access-hqmvw\") pod 
\"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.471962 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-horizon-secret-key\") pod \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.472090 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-logs\") pod \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\" (UID: \"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d\") " Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.473368 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-logs" (OuterVolumeSpecName: "logs") pod "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" (UID: "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.483461 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" (UID: "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.485688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-kube-api-access-hqmvw" (OuterVolumeSpecName: "kube-api-access-hqmvw") pod "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" (UID: "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d"). InnerVolumeSpecName "kube-api-access-hqmvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.526466 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-scripts" (OuterVolumeSpecName: "scripts") pod "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" (UID: "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.541003 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-config-data" (OuterVolumeSpecName: "config-data") pod "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" (UID: "a6f41119-8dff-4af1-9e94-b9ae91eb4f9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.555190 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-75dd6ffcd6-x4r8x"] Dec 09 04:56:14 crc kubenswrapper[4766]: W1209 04:56:14.560816 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb17e303d_fda5_45e4_a409_56c58f010d9e.slice/crio-87247433a581a8896b3172ecdb7cfba51ee19209c4e8b84aa779c01405a234c2 WatchSource:0}: Error finding container 87247433a581a8896b3172ecdb7cfba51ee19209c4e8b84aa779c01405a234c2: Status 404 returned error can't find the container with id 87247433a581a8896b3172ecdb7cfba51ee19209c4e8b84aa779c01405a234c2 Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.574774 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqmvw\" (UniqueName: \"kubernetes.io/projected/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-kube-api-access-hqmvw\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.574796 4766 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.574806 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.574816 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.574826 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.773225 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-9dcdf99c4-hcdzh"] Dec 09 04:56:14 crc kubenswrapper[4766]: I1209 04:56:14.904957 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-674c694cd8-x85ld"] Dec 09 04:56:14 crc kubenswrapper[4766]: W1209 04:56:14.906200 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ac6d93b_846d_4e73_a338_172101de6d3c.slice/crio-f2ee022724a6946183da04f7a1c79c1c306b0333208e8656c849681f5c7a9547 WatchSource:0}: Error finding container f2ee022724a6946183da04f7a1c79c1c306b0333208e8656c849681f5c7a9547: Status 404 returned error can't find the container with id f2ee022724a6946183da04f7a1c79c1c306b0333208e8656c849681f5c7a9547 Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.277508 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-674c694cd8-x85ld" event={"ID":"4ac6d93b-846d-4e73-a338-172101de6d3c","Type":"ContainerStarted","Data":"dbd699aeea858251b3fef43996ecf2647aa280ecf51362c7dbdf469125934f0a"} Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.277742 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-674c694cd8-x85ld" event={"ID":"4ac6d93b-846d-4e73-a338-172101de6d3c","Type":"ContainerStarted","Data":"f2ee022724a6946183da04f7a1c79c1c306b0333208e8656c849681f5c7a9547"} Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.278761 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.280811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75d7cb49fc-4rvvd" 
event={"ID":"a6f41119-8dff-4af1-9e94-b9ae91eb4f9d","Type":"ContainerDied","Data":"b2e758a9a7cc43e0a9957b56f7d45e6b1575b39963b766c0310da4ac093895de"} Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.280826 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75d7cb49fc-4rvvd" Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.280842 4766 scope.go:117] "RemoveContainer" containerID="52d469056cb01204be65e8718d26cea128b97bd3a1227c05c07d9dc7b7882311" Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.283121 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" event={"ID":"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac","Type":"ContainerStarted","Data":"48ad0722b91e872821801e2106ae13da45a6668f9c748f03d63da921219c6fe6"} Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.284630 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-75dd6ffcd6-x4r8x" event={"ID":"b17e303d-fda5-45e4-a409-56c58f010d9e","Type":"ContainerStarted","Data":"87247433a581a8896b3172ecdb7cfba51ee19209c4e8b84aa779c01405a234c2"} Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.301776 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-674c694cd8-x85ld" podStartSLOduration=2.301746377 podStartE2EDuration="2.301746377s" podCreationTimestamp="2025-12-09 04:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:56:15.297916504 +0000 UTC m=+6257.007221930" watchObservedRunningTime="2025-12-09 04:56:15.301746377 +0000 UTC m=+6257.011051803" Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.317448 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75d7cb49fc-4rvvd"] Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.325055 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/horizon-75d7cb49fc-4rvvd"] Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.463675 4766 scope.go:117] "RemoveContainer" containerID="9bd107bc516b0734d0998795d29c161752cf3343b49d1ea8e7e5e0a090d8e3b3" Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.756334 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b6cc9f797-qt9w7" Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.846642 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d55579bb5-sbh89"] Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.847171 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d55579bb5-sbh89" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon-log" containerID="cri-o://72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896" gracePeriod=30 Dec 09 04:56:15 crc kubenswrapper[4766]: I1209 04:56:15.847617 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d55579bb5-sbh89" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon" containerID="cri-o://b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0" gracePeriod=30 Dec 09 04:56:16 crc kubenswrapper[4766]: I1209 04:56:16.866268 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" path="/var/lib/kubelet/pods/a6f41119-8dff-4af1-9e94-b9ae91eb4f9d/volumes" Dec 09 04:56:16 crc kubenswrapper[4766]: I1209 04:56:16.870006 4766 scope.go:117] "RemoveContainer" containerID="3d5d28d857d873b564683a7f1f1206d89186dfbf7fc1eb282ca9da5cece0ac10" Dec 09 04:56:17 crc kubenswrapper[4766]: I1209 04:56:17.900021 4766 scope.go:117] "RemoveContainer" containerID="5a43adee1b5ea3ea7581e50e83c5ea968a364bd4ecd1692a4c77761a61106755" Dec 09 04:56:17 crc kubenswrapper[4766]: I1209 04:56:17.944390 4766 scope.go:117] "RemoveContainer" 
containerID="57fccd2c1e9af4b151544b1817b148ea4fb4c4d690f084d9d99a822f0d9a29c3" Dec 09 04:56:18 crc kubenswrapper[4766]: I1209 04:56:18.314001 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-75dd6ffcd6-x4r8x" event={"ID":"b17e303d-fda5-45e4-a409-56c58f010d9e","Type":"ContainerStarted","Data":"8d30b8d2950b9c1bdec5a4d9f22feb9a6552f3bdf62a964f49fd4dac857b6824"} Dec 09 04:56:18 crc kubenswrapper[4766]: I1209 04:56:18.315200 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:18 crc kubenswrapper[4766]: I1209 04:56:18.316753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" event={"ID":"93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac","Type":"ContainerStarted","Data":"1d3297cf87a92a7d84489af116207559cd048a53468a14454daceafdf233eefa"} Dec 09 04:56:18 crc kubenswrapper[4766]: I1209 04:56:18.317342 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:18 crc kubenswrapper[4766]: I1209 04:56:18.334031 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-75dd6ffcd6-x4r8x" podStartSLOduration=1.957894629 podStartE2EDuration="5.334015584s" podCreationTimestamp="2025-12-09 04:56:13 +0000 UTC" firstStartedPulling="2025-12-09 04:56:14.572511948 +0000 UTC m=+6256.281817374" lastFinishedPulling="2025-12-09 04:56:17.948632903 +0000 UTC m=+6259.657938329" observedRunningTime="2025-12-09 04:56:18.332732249 +0000 UTC m=+6260.042037675" watchObservedRunningTime="2025-12-09 04:56:18.334015584 +0000 UTC m=+6260.043321000" Dec 09 04:56:18 crc kubenswrapper[4766]: I1209 04:56:18.365739 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" podStartSLOduration=2.190438037 podStartE2EDuration="5.365722091s" podCreationTimestamp="2025-12-09 04:56:13 +0000 UTC" 
firstStartedPulling="2025-12-09 04:56:14.7708065 +0000 UTC m=+6256.480111926" lastFinishedPulling="2025-12-09 04:56:17.946090554 +0000 UTC m=+6259.655395980" observedRunningTime="2025-12-09 04:56:18.35755398 +0000 UTC m=+6260.066859416" watchObservedRunningTime="2025-12-09 04:56:18.365722091 +0000 UTC m=+6260.075027517" Dec 09 04:56:19 crc kubenswrapper[4766]: I1209 04:56:19.329109 4766 generic.go:334] "Generic (PLEG): container finished" podID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerID="b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0" exitCode=0 Dec 09 04:56:19 crc kubenswrapper[4766]: I1209 04:56:19.329163 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d55579bb5-sbh89" event={"ID":"e188b94d-17b3-4d2a-b03d-ef25ad951471","Type":"ContainerDied","Data":"b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0"} Dec 09 04:56:20 crc kubenswrapper[4766]: I1209 04:56:20.032642 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d55579bb5-sbh89" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Dec 09 04:56:25 crc kubenswrapper[4766]: I1209 04:56:25.109245 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-75dd6ffcd6-x4r8x" Dec 09 04:56:25 crc kubenswrapper[4766]: I1209 04:56:25.368277 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-9dcdf99c4-hcdzh" Dec 09 04:56:30 crc kubenswrapper[4766]: I1209 04:56:30.033033 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d55579bb5-sbh89" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: 
connect: connection refused" Dec 09 04:56:32 crc kubenswrapper[4766]: I1209 04:56:32.098392 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jhtfz"] Dec 09 04:56:32 crc kubenswrapper[4766]: I1209 04:56:32.109262 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-30f3-account-create-update-gjmrm"] Dec 09 04:56:32 crc kubenswrapper[4766]: I1209 04:56:32.119672 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jhtfz"] Dec 09 04:56:32 crc kubenswrapper[4766]: I1209 04:56:32.128225 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-30f3-account-create-update-gjmrm"] Dec 09 04:56:32 crc kubenswrapper[4766]: I1209 04:56:32.857303 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e3cb19-20a7-4b09-8c06-dd804f95f4c6" path="/var/lib/kubelet/pods/87e3cb19-20a7-4b09-8c06-dd804f95f4c6/volumes" Dec 09 04:56:32 crc kubenswrapper[4766]: I1209 04:56:32.858898 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8" path="/var/lib/kubelet/pods/f70c5cd4-d08a-4d42-8b8d-fd42a204d6a8/volumes" Dec 09 04:56:34 crc kubenswrapper[4766]: I1209 04:56:34.115329 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-674c694cd8-x85ld" Dec 09 04:56:37 crc kubenswrapper[4766]: I1209 04:56:37.316483 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:56:37 crc kubenswrapper[4766]: I1209 04:56:37.316559 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:56:40 crc kubenswrapper[4766]: I1209 04:56:40.033080 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d55579bb5-sbh89" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Dec 09 04:56:40 crc kubenswrapper[4766]: I1209 04:56:40.033647 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:56:41 crc kubenswrapper[4766]: I1209 04:56:41.056154 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-kgh24"] Dec 09 04:56:41 crc kubenswrapper[4766]: I1209 04:56:41.076921 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-kgh24"] Dec 09 04:56:42 crc kubenswrapper[4766]: I1209 04:56:42.852295 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f19d25-4821-4122-9149-1a479747837a" path="/var/lib/kubelet/pods/d0f19d25-4821-4122-9149-1a479747837a/volumes" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.373816 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.472360 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-scripts\") pod \"e188b94d-17b3-4d2a-b03d-ef25ad951471\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.472483 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e188b94d-17b3-4d2a-b03d-ef25ad951471-logs\") pod \"e188b94d-17b3-4d2a-b03d-ef25ad951471\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.472546 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxnnd\" (UniqueName: \"kubernetes.io/projected/e188b94d-17b3-4d2a-b03d-ef25ad951471-kube-api-access-pxnnd\") pod \"e188b94d-17b3-4d2a-b03d-ef25ad951471\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.472644 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e188b94d-17b3-4d2a-b03d-ef25ad951471-horizon-secret-key\") pod \"e188b94d-17b3-4d2a-b03d-ef25ad951471\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.472748 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-config-data\") pod \"e188b94d-17b3-4d2a-b03d-ef25ad951471\" (UID: \"e188b94d-17b3-4d2a-b03d-ef25ad951471\") " Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.473660 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e188b94d-17b3-4d2a-b03d-ef25ad951471-logs" (OuterVolumeSpecName: "logs") pod "e188b94d-17b3-4d2a-b03d-ef25ad951471" (UID: "e188b94d-17b3-4d2a-b03d-ef25ad951471"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.473832 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e188b94d-17b3-4d2a-b03d-ef25ad951471-logs\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.479651 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e188b94d-17b3-4d2a-b03d-ef25ad951471-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e188b94d-17b3-4d2a-b03d-ef25ad951471" (UID: "e188b94d-17b3-4d2a-b03d-ef25ad951471"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.479899 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e188b94d-17b3-4d2a-b03d-ef25ad951471-kube-api-access-pxnnd" (OuterVolumeSpecName: "kube-api-access-pxnnd") pod "e188b94d-17b3-4d2a-b03d-ef25ad951471" (UID: "e188b94d-17b3-4d2a-b03d-ef25ad951471"). InnerVolumeSpecName "kube-api-access-pxnnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.500538 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-config-data" (OuterVolumeSpecName: "config-data") pod "e188b94d-17b3-4d2a-b03d-ef25ad951471" (UID: "e188b94d-17b3-4d2a-b03d-ef25ad951471"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.512352 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-scripts" (OuterVolumeSpecName: "scripts") pod "e188b94d-17b3-4d2a-b03d-ef25ad951471" (UID: "e188b94d-17b3-4d2a-b03d-ef25ad951471"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.576096 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.576131 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxnnd\" (UniqueName: \"kubernetes.io/projected/e188b94d-17b3-4d2a-b03d-ef25ad951471-kube-api-access-pxnnd\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.576142 4766 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e188b94d-17b3-4d2a-b03d-ef25ad951471-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.576152 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e188b94d-17b3-4d2a-b03d-ef25ad951471-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.726331 4766 generic.go:334] "Generic (PLEG): container finished" podID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerID="72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896" exitCode=137 Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.726427 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d55579bb5-sbh89" 
event={"ID":"e188b94d-17b3-4d2a-b03d-ef25ad951471","Type":"ContainerDied","Data":"72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896"} Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.726464 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d55579bb5-sbh89" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.726496 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d55579bb5-sbh89" event={"ID":"e188b94d-17b3-4d2a-b03d-ef25ad951471","Type":"ContainerDied","Data":"c3c6bec4ffafbb3c686e5a08ad20ad791c0e48620c09d65a91e93f5d5cecd376"} Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.726525 4766 scope.go:117] "RemoveContainer" containerID="b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.779253 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d55579bb5-sbh89"] Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.787890 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d55579bb5-sbh89"] Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.871253 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" path="/var/lib/kubelet/pods/e188b94d-17b3-4d2a-b03d-ef25ad951471/volumes" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.913962 4766 scope.go:117] "RemoveContainer" containerID="72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.933284 4766 scope.go:117] "RemoveContainer" containerID="b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0" Dec 09 04:56:46 crc kubenswrapper[4766]: E1209 04:56:46.933685 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0\": container with ID starting with b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0 not found: ID does not exist" containerID="b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.933764 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0"} err="failed to get container status \"b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0\": rpc error: code = NotFound desc = could not find container \"b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0\": container with ID starting with b701954e8fb5baab514392fb5d150ae4a998a15a81c2a2fd8297faeef9861bd0 not found: ID does not exist" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.933792 4766 scope.go:117] "RemoveContainer" containerID="72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896" Dec 09 04:56:46 crc kubenswrapper[4766]: E1209 04:56:46.934077 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896\": container with ID starting with 72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896 not found: ID does not exist" containerID="72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896" Dec 09 04:56:46 crc kubenswrapper[4766]: I1209 04:56:46.934124 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896"} err="failed to get container status \"72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896\": rpc error: code = NotFound desc = could not find container \"72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896\": container with ID 
starting with 72052c629dcaa12ad072bcdf253c3fe2ceb42536ae4bc9816314cb543e490896 not found: ID does not exist" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.411693 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg"] Dec 09 04:56:52 crc kubenswrapper[4766]: E1209 04:56:52.412600 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.412615 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon" Dec 09 04:56:52 crc kubenswrapper[4766]: E1209 04:56:52.412634 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon-log" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.412640 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon-log" Dec 09 04:56:52 crc kubenswrapper[4766]: E1209 04:56:52.412660 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon-log" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.412666 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon-log" Dec 09 04:56:52 crc kubenswrapper[4766]: E1209 04:56:52.412693 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.412699 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.412888 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon-log" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.412905 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.412912 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f41119-8dff-4af1-9e94-b9ae91eb4f9d" containerName="horizon-log" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.412934 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e188b94d-17b3-4d2a-b03d-ef25ad951471" containerName="horizon" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.414703 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.423944 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.429933 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg"] Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.497686 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.497775 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkrmv\" (UniqueName: 
\"kubernetes.io/projected/ce45094b-177d-4065-a7c0-46a656359ed4-kube-api-access-dkrmv\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.497908 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.600081 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkrmv\" (UniqueName: \"kubernetes.io/projected/ce45094b-177d-4065-a7c0-46a656359ed4-kube-api-access-dkrmv\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.600175 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.600512 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-util\") pod 
\"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.601359 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.601433 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.633622 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrmv\" (UniqueName: \"kubernetes.io/projected/ce45094b-177d-4065-a7c0-46a656359ed4-kube-api-access-dkrmv\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:52 crc kubenswrapper[4766]: I1209 04:56:52.746554 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:53 crc kubenswrapper[4766]: W1209 04:56:53.344887 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce45094b_177d_4065_a7c0_46a656359ed4.slice/crio-6576313bd8f916ec7d6c14f7423c1e617c64bb38fd53564f014eaef4e32f8c1c WatchSource:0}: Error finding container 6576313bd8f916ec7d6c14f7423c1e617c64bb38fd53564f014eaef4e32f8c1c: Status 404 returned error can't find the container with id 6576313bd8f916ec7d6c14f7423c1e617c64bb38fd53564f014eaef4e32f8c1c Dec 09 04:56:53 crc kubenswrapper[4766]: I1209 04:56:53.347747 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg"] Dec 09 04:56:53 crc kubenswrapper[4766]: I1209 04:56:53.802309 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" event={"ID":"ce45094b-177d-4065-a7c0-46a656359ed4","Type":"ContainerStarted","Data":"8e6c024c51c5b449f8e459770c7e75f1f137e58b7c484903b5cd324453f74ddc"} Dec 09 04:56:53 crc kubenswrapper[4766]: I1209 04:56:53.802810 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" event={"ID":"ce45094b-177d-4065-a7c0-46a656359ed4","Type":"ContainerStarted","Data":"6576313bd8f916ec7d6c14f7423c1e617c64bb38fd53564f014eaef4e32f8c1c"} Dec 09 04:56:54 crc kubenswrapper[4766]: I1209 04:56:54.824096 4766 generic.go:334] "Generic (PLEG): container finished" podID="ce45094b-177d-4065-a7c0-46a656359ed4" containerID="8e6c024c51c5b449f8e459770c7e75f1f137e58b7c484903b5cd324453f74ddc" exitCode=0 Dec 09 04:56:54 crc kubenswrapper[4766]: I1209 04:56:54.824196 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" event={"ID":"ce45094b-177d-4065-a7c0-46a656359ed4","Type":"ContainerDied","Data":"8e6c024c51c5b449f8e459770c7e75f1f137e58b7c484903b5cd324453f74ddc"} Dec 09 04:56:56 crc kubenswrapper[4766]: I1209 04:56:56.857433 4766 generic.go:334] "Generic (PLEG): container finished" podID="ce45094b-177d-4065-a7c0-46a656359ed4" containerID="2691caefa6e1343a3e05f08d3c7acbbb35090a0083825e0d4bc41ed9cf0ffaf0" exitCode=0 Dec 09 04:56:56 crc kubenswrapper[4766]: I1209 04:56:56.876195 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" event={"ID":"ce45094b-177d-4065-a7c0-46a656359ed4","Type":"ContainerDied","Data":"2691caefa6e1343a3e05f08d3c7acbbb35090a0083825e0d4bc41ed9cf0ffaf0"} Dec 09 04:56:57 crc kubenswrapper[4766]: I1209 04:56:57.873613 4766 generic.go:334] "Generic (PLEG): container finished" podID="ce45094b-177d-4065-a7c0-46a656359ed4" containerID="a692847671dd4a7a6aa9d52e161db978afdac5b2b8a98b8c784c04b092c9676b" exitCode=0 Dec 09 04:56:57 crc kubenswrapper[4766]: I1209 04:56:57.874326 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" event={"ID":"ce45094b-177d-4065-a7c0-46a656359ed4","Type":"ContainerDied","Data":"a692847671dd4a7a6aa9d52e161db978afdac5b2b8a98b8c784c04b092c9676b"} Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.328630 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.462374 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-util\") pod \"ce45094b-177d-4065-a7c0-46a656359ed4\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.462488 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-bundle\") pod \"ce45094b-177d-4065-a7c0-46a656359ed4\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.462690 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkrmv\" (UniqueName: \"kubernetes.io/projected/ce45094b-177d-4065-a7c0-46a656359ed4-kube-api-access-dkrmv\") pod \"ce45094b-177d-4065-a7c0-46a656359ed4\" (UID: \"ce45094b-177d-4065-a7c0-46a656359ed4\") " Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.465410 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-bundle" (OuterVolumeSpecName: "bundle") pod "ce45094b-177d-4065-a7c0-46a656359ed4" (UID: "ce45094b-177d-4065-a7c0-46a656359ed4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.472912 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce45094b-177d-4065-a7c0-46a656359ed4-kube-api-access-dkrmv" (OuterVolumeSpecName: "kube-api-access-dkrmv") pod "ce45094b-177d-4065-a7c0-46a656359ed4" (UID: "ce45094b-177d-4065-a7c0-46a656359ed4"). InnerVolumeSpecName "kube-api-access-dkrmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.485869 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-util" (OuterVolumeSpecName: "util") pod "ce45094b-177d-4065-a7c0-46a656359ed4" (UID: "ce45094b-177d-4065-a7c0-46a656359ed4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.565885 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-util\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.565934 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce45094b-177d-4065-a7c0-46a656359ed4-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.565954 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkrmv\" (UniqueName: \"kubernetes.io/projected/ce45094b-177d-4065-a7c0-46a656359ed4-kube-api-access-dkrmv\") on node \"crc\" DevicePath \"\"" Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.900587 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" event={"ID":"ce45094b-177d-4065-a7c0-46a656359ed4","Type":"ContainerDied","Data":"6576313bd8f916ec7d6c14f7423c1e617c64bb38fd53564f014eaef4e32f8c1c"} Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.900931 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6576313bd8f916ec7d6c14f7423c1e617c64bb38fd53564f014eaef4e32f8c1c" Dec 09 04:56:59 crc kubenswrapper[4766]: I1209 04:56:59.900703 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg" Dec 09 04:57:07 crc kubenswrapper[4766]: I1209 04:57:07.316112 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:57:07 crc kubenswrapper[4766]: I1209 04:57:07.316608 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.067675 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5pk2z"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.084351 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ce1d-account-create-update-5wqcp"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.092720 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ce1d-account-create-update-5wqcp"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.106396 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5pk2z"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.116351 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r"] Dec 09 04:57:11 crc kubenswrapper[4766]: E1209 04:57:11.116813 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce45094b-177d-4065-a7c0-46a656359ed4" containerName="extract" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.116829 4766 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ce45094b-177d-4065-a7c0-46a656359ed4" containerName="extract" Dec 09 04:57:11 crc kubenswrapper[4766]: E1209 04:57:11.116866 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce45094b-177d-4065-a7c0-46a656359ed4" containerName="util" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.116884 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce45094b-177d-4065-a7c0-46a656359ed4" containerName="util" Dec 09 04:57:11 crc kubenswrapper[4766]: E1209 04:57:11.116894 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce45094b-177d-4065-a7c0-46a656359ed4" containerName="pull" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.116900 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce45094b-177d-4065-a7c0-46a656359ed4" containerName="pull" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.117081 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce45094b-177d-4065-a7c0-46a656359ed4" containerName="extract" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.117773 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.122291 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rvwmd" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.123175 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.123504 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.125400 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.195985 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.197467 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.199679 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-msvrd" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.200088 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.205875 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.207191 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.224320 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.249190 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zn5c\" (UniqueName: \"kubernetes.io/projected/351744ca-7f25-47b3-b00e-f7bdaf6f1693-kube-api-access-5zn5c\") pod \"obo-prometheus-operator-668cf9dfbb-h8f7r\" (UID: \"351744ca-7f25-47b3-b00e-f7bdaf6f1693\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.257487 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.351337 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c5e8ca0-1744-4ea2-8e81-dab7828b617a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc\" (UID: \"4c5e8ca0-1744-4ea2-8e81-dab7828b617a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.351389 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16f6a81d-c16a-43d8-8bd0-d58756bb2605-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg\" (UID: \"16f6a81d-c16a-43d8-8bd0-d58756bb2605\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.351430 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16f6a81d-c16a-43d8-8bd0-d58756bb2605-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg\" (UID: \"16f6a81d-c16a-43d8-8bd0-d58756bb2605\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.351850 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5e8ca0-1744-4ea2-8e81-dab7828b617a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc\" (UID: \"4c5e8ca0-1744-4ea2-8e81-dab7828b617a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.351917 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zn5c\" (UniqueName: \"kubernetes.io/projected/351744ca-7f25-47b3-b00e-f7bdaf6f1693-kube-api-access-5zn5c\") pod \"obo-prometheus-operator-668cf9dfbb-h8f7r\" (UID: \"351744ca-7f25-47b3-b00e-f7bdaf6f1693\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.371447 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zn5c\" (UniqueName: \"kubernetes.io/projected/351744ca-7f25-47b3-b00e-f7bdaf6f1693-kube-api-access-5zn5c\") pod \"obo-prometheus-operator-668cf9dfbb-h8f7r\" (UID: \"351744ca-7f25-47b3-b00e-f7bdaf6f1693\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.417356 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-zrnmc"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.418826 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.421307 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-zc4qq" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.421934 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.441827 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-zrnmc"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.447453 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.453637 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5e8ca0-1744-4ea2-8e81-dab7828b617a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc\" (UID: \"4c5e8ca0-1744-4ea2-8e81-dab7828b617a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.453737 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c5e8ca0-1744-4ea2-8e81-dab7828b617a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc\" (UID: \"4c5e8ca0-1744-4ea2-8e81-dab7828b617a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.453770 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/16f6a81d-c16a-43d8-8bd0-d58756bb2605-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg\" (UID: \"16f6a81d-c16a-43d8-8bd0-d58756bb2605\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.453802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16f6a81d-c16a-43d8-8bd0-d58756bb2605-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg\" (UID: \"16f6a81d-c16a-43d8-8bd0-d58756bb2605\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.458888 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16f6a81d-c16a-43d8-8bd0-d58756bb2605-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg\" (UID: \"16f6a81d-c16a-43d8-8bd0-d58756bb2605\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.460925 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c5e8ca0-1744-4ea2-8e81-dab7828b617a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc\" (UID: \"4c5e8ca0-1744-4ea2-8e81-dab7828b617a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.465806 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16f6a81d-c16a-43d8-8bd0-d58756bb2605-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg\" (UID: \"16f6a81d-c16a-43d8-8bd0-d58756bb2605\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.485460 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5e8ca0-1744-4ea2-8e81-dab7828b617a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc\" (UID: \"4c5e8ca0-1744-4ea2-8e81-dab7828b617a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.525874 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.558671 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.559322 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wpw9\" (UniqueName: \"kubernetes.io/projected/ca072608-0822-46da-a58f-e6544233b091-kube-api-access-5wpw9\") pod \"observability-operator-d8bb48f5d-zrnmc\" (UID: \"ca072608-0822-46da-a58f-e6544233b091\") " pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.559414 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca072608-0822-46da-a58f-e6544233b091-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-zrnmc\" (UID: \"ca072608-0822-46da-a58f-e6544233b091\") " pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.627814 4766 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operators/perses-operator-5446b9c989-9smnr"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.629151 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.636182 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-jzk7r" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.667574 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wpw9\" (UniqueName: \"kubernetes.io/projected/ca072608-0822-46da-a58f-e6544233b091-kube-api-access-5wpw9\") pod \"observability-operator-d8bb48f5d-zrnmc\" (UID: \"ca072608-0822-46da-a58f-e6544233b091\") " pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.667673 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca072608-0822-46da-a58f-e6544233b091-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-zrnmc\" (UID: \"ca072608-0822-46da-a58f-e6544233b091\") " pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.676484 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-9smnr"] Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.698082 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca072608-0822-46da-a58f-e6544233b091-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-zrnmc\" (UID: \"ca072608-0822-46da-a58f-e6544233b091\") " pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 
04:57:11.708683 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wpw9\" (UniqueName: \"kubernetes.io/projected/ca072608-0822-46da-a58f-e6544233b091-kube-api-access-5wpw9\") pod \"observability-operator-d8bb48f5d-zrnmc\" (UID: \"ca072608-0822-46da-a58f-e6544233b091\") " pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.746087 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.769537 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttwt7\" (UniqueName: \"kubernetes.io/projected/a8b37c0c-fe68-4f18-a5df-43d3221934a2-kube-api-access-ttwt7\") pod \"perses-operator-5446b9c989-9smnr\" (UID: \"a8b37c0c-fe68-4f18-a5df-43d3221934a2\") " pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.770081 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8b37c0c-fe68-4f18-a5df-43d3221934a2-openshift-service-ca\") pod \"perses-operator-5446b9c989-9smnr\" (UID: \"a8b37c0c-fe68-4f18-a5df-43d3221934a2\") " pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.872394 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8b37c0c-fe68-4f18-a5df-43d3221934a2-openshift-service-ca\") pod \"perses-operator-5446b9c989-9smnr\" (UID: \"a8b37c0c-fe68-4f18-a5df-43d3221934a2\") " pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.873280 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8b37c0c-fe68-4f18-a5df-43d3221934a2-openshift-service-ca\") pod \"perses-operator-5446b9c989-9smnr\" (UID: \"a8b37c0c-fe68-4f18-a5df-43d3221934a2\") " pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.873426 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttwt7\" (UniqueName: \"kubernetes.io/projected/a8b37c0c-fe68-4f18-a5df-43d3221934a2-kube-api-access-ttwt7\") pod \"perses-operator-5446b9c989-9smnr\" (UID: \"a8b37c0c-fe68-4f18-a5df-43d3221934a2\") " pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:11 crc kubenswrapper[4766]: I1209 04:57:11.898270 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttwt7\" (UniqueName: \"kubernetes.io/projected/a8b37c0c-fe68-4f18-a5df-43d3221934a2-kube-api-access-ttwt7\") pod \"perses-operator-5446b9c989-9smnr\" (UID: \"a8b37c0c-fe68-4f18-a5df-43d3221934a2\") " pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:12 crc kubenswrapper[4766]: I1209 04:57:12.012183 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:12 crc kubenswrapper[4766]: I1209 04:57:12.046050 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r"] Dec 09 04:57:12 crc kubenswrapper[4766]: I1209 04:57:12.214360 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg"] Dec 09 04:57:12 crc kubenswrapper[4766]: W1209 04:57:12.243411 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f6a81d_c16a_43d8_8bd0_d58756bb2605.slice/crio-5751d8222a769b8ff6d5051866ff25d775469eb6901c5c713dedcc2039b6e235 WatchSource:0}: Error finding container 5751d8222a769b8ff6d5051866ff25d775469eb6901c5c713dedcc2039b6e235: Status 404 returned error can't find the container with id 5751d8222a769b8ff6d5051866ff25d775469eb6901c5c713dedcc2039b6e235 Dec 09 04:57:12 crc kubenswrapper[4766]: I1209 04:57:12.353949 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc"] Dec 09 04:57:12 crc kubenswrapper[4766]: I1209 04:57:12.453147 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-zrnmc"] Dec 09 04:57:12 crc kubenswrapper[4766]: I1209 04:57:12.638660 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-9smnr"] Dec 09 04:57:12 crc kubenswrapper[4766]: W1209 04:57:12.643030 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b37c0c_fe68_4f18_a5df_43d3221934a2.slice/crio-8f3a75e5f1f5af381a587e23fc8ba54ea9234af7a9efe3313c6f8f68eeb4412f WatchSource:0}: Error finding container 8f3a75e5f1f5af381a587e23fc8ba54ea9234af7a9efe3313c6f8f68eeb4412f: 
Status 404 returned error can't find the container with id 8f3a75e5f1f5af381a587e23fc8ba54ea9234af7a9efe3313c6f8f68eeb4412f Dec 09 04:57:12 crc kubenswrapper[4766]: I1209 04:57:12.849594 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4710023f-0f31-45ac-b1bb-f6dd3b84704a" path="/var/lib/kubelet/pods/4710023f-0f31-45ac-b1bb-f6dd3b84704a/volumes" Dec 09 04:57:12 crc kubenswrapper[4766]: I1209 04:57:12.850786 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5abc385-6494-4563-968a-124a07d7b1e9" path="/var/lib/kubelet/pods/a5abc385-6494-4563-968a-124a07d7b1e9/volumes" Dec 09 04:57:13 crc kubenswrapper[4766]: I1209 04:57:13.021606 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-9smnr" event={"ID":"a8b37c0c-fe68-4f18-a5df-43d3221934a2","Type":"ContainerStarted","Data":"8f3a75e5f1f5af381a587e23fc8ba54ea9234af7a9efe3313c6f8f68eeb4412f"} Dec 09 04:57:13 crc kubenswrapper[4766]: I1209 04:57:13.023757 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" event={"ID":"4c5e8ca0-1744-4ea2-8e81-dab7828b617a","Type":"ContainerStarted","Data":"9e3d6f6352d57ee88cd37f46b5a97e994322fcde6569880301fe8e0351307b50"} Dec 09 04:57:13 crc kubenswrapper[4766]: I1209 04:57:13.027414 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" event={"ID":"ca072608-0822-46da-a58f-e6544233b091","Type":"ContainerStarted","Data":"a01b6eb44587db1eec7e5d08b8c4f22c178f8352c1e2b3c53d4f6a52bde94a3a"} Dec 09 04:57:13 crc kubenswrapper[4766]: I1209 04:57:13.041629 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r" event={"ID":"351744ca-7f25-47b3-b00e-f7bdaf6f1693","Type":"ContainerStarted","Data":"40be382fedda6acf5aff517604b07c07b9774b4658d586df3e42c60cd579080c"} Dec 09 04:57:13 crc 
kubenswrapper[4766]: I1209 04:57:13.043135 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" event={"ID":"16f6a81d-c16a-43d8-8bd0-d58756bb2605","Type":"ContainerStarted","Data":"5751d8222a769b8ff6d5051866ff25d775469eb6901c5c713dedcc2039b6e235"} Dec 09 04:57:16 crc kubenswrapper[4766]: I1209 04:57:16.029297 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2c8w2"] Dec 09 04:57:16 crc kubenswrapper[4766]: I1209 04:57:16.037692 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2c8w2"] Dec 09 04:57:16 crc kubenswrapper[4766]: I1209 04:57:16.849874 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4" path="/var/lib/kubelet/pods/c2c888aa-7ec3-4e63-8d79-dc668fd0d5e4/volumes" Dec 09 04:57:18 crc kubenswrapper[4766]: I1209 04:57:18.305677 4766 scope.go:117] "RemoveContainer" containerID="ac46f9d70c6436d0e62a8804cf61fc36234c3e971c7f31a576ef218d3a0903b5" Dec 09 04:57:18 crc kubenswrapper[4766]: I1209 04:57:18.970446 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ml5b7"] Dec 09 04:57:18 crc kubenswrapper[4766]: I1209 04:57:18.978499 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:18 crc kubenswrapper[4766]: I1209 04:57:18.994599 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ml5b7"] Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.098451 4766 scope.go:117] "RemoveContainer" containerID="34554c6730d06ad95e63a9244cbcac645d21c7d11356346774cc0c03fb9927b5" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.102372 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-catalog-content\") pod \"community-operators-ml5b7\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.102425 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-utilities\") pod \"community-operators-ml5b7\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.102534 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7thmf\" (UniqueName: \"kubernetes.io/projected/8f631587-0e09-4a40-97a5-83e97c1c2242-kube-api-access-7thmf\") pod \"community-operators-ml5b7\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.204740 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7thmf\" (UniqueName: \"kubernetes.io/projected/8f631587-0e09-4a40-97a5-83e97c1c2242-kube-api-access-7thmf\") pod \"community-operators-ml5b7\" (UID: 
\"8f631587-0e09-4a40-97a5-83e97c1c2242\") " pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.204927 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-catalog-content\") pod \"community-operators-ml5b7\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.204962 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-utilities\") pod \"community-operators-ml5b7\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.205845 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-utilities\") pod \"community-operators-ml5b7\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.206421 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-catalog-content\") pod \"community-operators-ml5b7\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.254159 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7thmf\" (UniqueName: \"kubernetes.io/projected/8f631587-0e09-4a40-97a5-83e97c1c2242-kube-api-access-7thmf\") pod \"community-operators-ml5b7\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " 
pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:19 crc kubenswrapper[4766]: I1209 04:57:19.351510 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:22 crc kubenswrapper[4766]: I1209 04:57:22.336849 4766 scope.go:117] "RemoveContainer" containerID="87363e8fe0febdcef5dc8f33554b29a6a4781810e413f9fe4e899fb0962bf53f" Dec 09 04:57:22 crc kubenswrapper[4766]: I1209 04:57:22.421462 4766 scope.go:117] "RemoveContainer" containerID="a7ee0eafd72377b2e560719262daa08f31290c148cfffda275689057d7aa4b36" Dec 09 04:57:22 crc kubenswrapper[4766]: I1209 04:57:22.584721 4766 scope.go:117] "RemoveContainer" containerID="df54a1606f35d6b2cd3a23d7de1d52b269150e4e33b02655dae5c63ebc86c17d" Dec 09 04:57:22 crc kubenswrapper[4766]: I1209 04:57:22.717875 4766 scope.go:117] "RemoveContainer" containerID="fdc245bf65be30f0eab7a2e26fcfe3455a04d5c788a2e88ef8b4a8ad9ffc20a9" Dec 09 04:57:22 crc kubenswrapper[4766]: I1209 04:57:22.910965 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ml5b7"] Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.175826 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" event={"ID":"16f6a81d-c16a-43d8-8bd0-d58756bb2605","Type":"ContainerStarted","Data":"9c2707c7302818d5eb40c73632b809d8402cd0d46cf4e05bc4eb507c87118885"} Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.177448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" event={"ID":"4c5e8ca0-1744-4ea2-8e81-dab7828b617a","Type":"ContainerStarted","Data":"0e60da6f9eb494f7b1f6ecfe88f070e3ec2b28b8ed5c0db6bca886639f6f3fe0"} Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.179636 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerID="15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20" exitCode=0 Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.179703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml5b7" event={"ID":"8f631587-0e09-4a40-97a5-83e97c1c2242","Type":"ContainerDied","Data":"15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20"} Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.179724 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml5b7" event={"ID":"8f631587-0e09-4a40-97a5-83e97c1c2242","Type":"ContainerStarted","Data":"84139aca8158efde3900df3beec2832053abd0708b50e2e7998d2246353fd068"} Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.181358 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" event={"ID":"ca072608-0822-46da-a58f-e6544233b091","Type":"ContainerStarted","Data":"e724473e832861744bb1594c2839b8cfa1639b3fc30221cad067f9237335a911"} Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.181549 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.183181 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r" event={"ID":"351744ca-7f25-47b3-b00e-f7bdaf6f1693","Type":"ContainerStarted","Data":"21d37271d7e8d0e90d5e640a25b0e1e4237e62b444e79193d2941a70555c8cf1"} Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.184837 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-9smnr" event={"ID":"a8b37c0c-fe68-4f18-a5df-43d3221934a2","Type":"ContainerStarted","Data":"7d9127d7b00931550df4a94f7144111ea9ea73e30ad8dfffa73b154020f34e69"} Dec 09 04:57:23 crc 
kubenswrapper[4766]: I1209 04:57:23.184962 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.195033 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.215620 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg" podStartSLOduration=2.147023984 podStartE2EDuration="12.215598472s" podCreationTimestamp="2025-12-09 04:57:11 +0000 UTC" firstStartedPulling="2025-12-09 04:57:12.268157398 +0000 UTC m=+6313.977462824" lastFinishedPulling="2025-12-09 04:57:22.336731886 +0000 UTC m=+6324.046037312" observedRunningTime="2025-12-09 04:57:23.207078412 +0000 UTC m=+6324.916383858" watchObservedRunningTime="2025-12-09 04:57:23.215598472 +0000 UTC m=+6324.924903898" Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.230451 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-9smnr" podStartSLOduration=2.498092306 podStartE2EDuration="12.230431983s" podCreationTimestamp="2025-12-09 04:57:11 +0000 UTC" firstStartedPulling="2025-12-09 04:57:12.64468012 +0000 UTC m=+6314.353985556" lastFinishedPulling="2025-12-09 04:57:22.377019807 +0000 UTC m=+6324.086325233" observedRunningTime="2025-12-09 04:57:23.229780596 +0000 UTC m=+6324.939086022" watchObservedRunningTime="2025-12-09 04:57:23.230431983 +0000 UTC m=+6324.939737409" Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.285161 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h8f7r" podStartSLOduration=1.9928068639999998 podStartE2EDuration="12.285142783s" podCreationTimestamp="2025-12-09 04:57:11 
+0000 UTC" firstStartedPulling="2025-12-09 04:57:12.083541416 +0000 UTC m=+6313.792846842" lastFinishedPulling="2025-12-09 04:57:22.375877285 +0000 UTC m=+6324.085182761" observedRunningTime="2025-12-09 04:57:23.278590516 +0000 UTC m=+6324.987895952" watchObservedRunningTime="2025-12-09 04:57:23.285142783 +0000 UTC m=+6324.994448209" Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.329578 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc" podStartSLOduration=2.32990936 podStartE2EDuration="12.329562295s" podCreationTimestamp="2025-12-09 04:57:11 +0000 UTC" firstStartedPulling="2025-12-09 04:57:12.376178059 +0000 UTC m=+6314.085483485" lastFinishedPulling="2025-12-09 04:57:22.375830994 +0000 UTC m=+6324.085136420" observedRunningTime="2025-12-09 04:57:23.32384576 +0000 UTC m=+6325.033151186" watchObservedRunningTime="2025-12-09 04:57:23.329562295 +0000 UTC m=+6325.038867721" Dec 09 04:57:23 crc kubenswrapper[4766]: I1209 04:57:23.351015 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-zrnmc" podStartSLOduration=2.388078224 podStartE2EDuration="12.350996285s" podCreationTimestamp="2025-12-09 04:57:11 +0000 UTC" firstStartedPulling="2025-12-09 04:57:12.465749662 +0000 UTC m=+6314.175055088" lastFinishedPulling="2025-12-09 04:57:22.428667723 +0000 UTC m=+6324.137973149" observedRunningTime="2025-12-09 04:57:23.348898498 +0000 UTC m=+6325.058203924" watchObservedRunningTime="2025-12-09 04:57:23.350996285 +0000 UTC m=+6325.060301711" Dec 09 04:57:24 crc kubenswrapper[4766]: I1209 04:57:24.193727 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml5b7" event={"ID":"8f631587-0e09-4a40-97a5-83e97c1c2242","Type":"ContainerStarted","Data":"8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e"} Dec 09 04:57:26 crc 
kubenswrapper[4766]: I1209 04:57:26.211316 4766 generic.go:334] "Generic (PLEG): container finished" podID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerID="8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e" exitCode=0 Dec 09 04:57:26 crc kubenswrapper[4766]: I1209 04:57:26.211398 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml5b7" event={"ID":"8f631587-0e09-4a40-97a5-83e97c1c2242","Type":"ContainerDied","Data":"8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e"} Dec 09 04:57:27 crc kubenswrapper[4766]: I1209 04:57:27.222396 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml5b7" event={"ID":"8f631587-0e09-4a40-97a5-83e97c1c2242","Type":"ContainerStarted","Data":"1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448"} Dec 09 04:57:27 crc kubenswrapper[4766]: I1209 04:57:27.254385 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ml5b7" podStartSLOduration=5.812812702 podStartE2EDuration="9.254368717s" podCreationTimestamp="2025-12-09 04:57:18 +0000 UTC" firstStartedPulling="2025-12-09 04:57:23.181341916 +0000 UTC m=+6324.890647342" lastFinishedPulling="2025-12-09 04:57:26.622897911 +0000 UTC m=+6328.332203357" observedRunningTime="2025-12-09 04:57:27.242332872 +0000 UTC m=+6328.951638288" watchObservedRunningTime="2025-12-09 04:57:27.254368717 +0000 UTC m=+6328.963674143" Dec 09 04:57:29 crc kubenswrapper[4766]: I1209 04:57:29.351949 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:29 crc kubenswrapper[4766]: I1209 04:57:29.352584 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:30 crc kubenswrapper[4766]: I1209 04:57:30.432581 4766 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/community-operators-ml5b7" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="registry-server" probeResult="failure" output=< Dec 09 04:57:30 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 04:57:30 crc kubenswrapper[4766]: > Dec 09 04:57:32 crc kubenswrapper[4766]: I1209 04:57:32.016524 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-9smnr" Dec 09 04:57:34 crc kubenswrapper[4766]: I1209 04:57:34.909226 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 09 04:57:34 crc kubenswrapper[4766]: I1209 04:57:34.909823 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="1efe8f28-9359-4214-911b-594db568c9a5" containerName="openstackclient" containerID="cri-o://eda57d876cfc768360846060dd21b3ebad99c69d1a82fda52cbc45b7acfaf5b1" gracePeriod=2 Dec 09 04:57:34 crc kubenswrapper[4766]: I1209 04:57:34.919784 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 09 04:57:34 crc kubenswrapper[4766]: I1209 04:57:34.960333 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 09 04:57:34 crc kubenswrapper[4766]: E1209 04:57:34.960809 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe8f28-9359-4214-911b-594db568c9a5" containerName="openstackclient" Dec 09 04:57:34 crc kubenswrapper[4766]: I1209 04:57:34.960831 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe8f28-9359-4214-911b-594db568c9a5" containerName="openstackclient" Dec 09 04:57:34 crc kubenswrapper[4766]: I1209 04:57:34.961088 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efe8f28-9359-4214-911b-594db568c9a5" containerName="openstackclient" Dec 09 04:57:34 crc kubenswrapper[4766]: I1209 04:57:34.965951 4766 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.007844 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.012499 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1efe8f28-9359-4214-911b-594db568c9a5" podUID="8d7295db-9761-4829-93af-705e9af64b1c" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.050758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d7295db-9761-4829-93af-705e9af64b1c-openstack-config\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.051406 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrszp\" (UniqueName: \"kubernetes.io/projected/8d7295db-9761-4829-93af-705e9af64b1c-kube-api-access-mrszp\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.051506 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d7295db-9761-4829-93af-705e9af64b1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.127427 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.128780 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.134831 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5d8gw" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.153463 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp8hk\" (UniqueName: \"kubernetes.io/projected/0bbf5934-e11a-4118-afde-57ecc9d95cd0-kube-api-access-sp8hk\") pod \"kube-state-metrics-0\" (UID: \"0bbf5934-e11a-4118-afde-57ecc9d95cd0\") " pod="openstack/kube-state-metrics-0" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.153778 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d7295db-9761-4829-93af-705e9af64b1c-openstack-config\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.153937 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrszp\" (UniqueName: \"kubernetes.io/projected/8d7295db-9761-4829-93af-705e9af64b1c-kube-api-access-mrszp\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.154030 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d7295db-9761-4829-93af-705e9af64b1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.155062 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/8d7295db-9761-4829-93af-705e9af64b1c-openstack-config\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.163939 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d7295db-9761-4829-93af-705e9af64b1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.172965 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.210335 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrszp\" (UniqueName: \"kubernetes.io/projected/8d7295db-9761-4829-93af-705e9af64b1c-kube-api-access-mrszp\") pod \"openstackclient\" (UID: \"8d7295db-9761-4829-93af-705e9af64b1c\") " pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.255559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp8hk\" (UniqueName: \"kubernetes.io/projected/0bbf5934-e11a-4118-afde-57ecc9d95cd0-kube-api-access-sp8hk\") pod \"kube-state-metrics-0\" (UID: \"0bbf5934-e11a-4118-afde-57ecc9d95cd0\") " pod="openstack/kube-state-metrics-0" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.297605 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.316921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp8hk\" (UniqueName: \"kubernetes.io/projected/0bbf5934-e11a-4118-afde-57ecc9d95cd0-kube-api-access-sp8hk\") pod \"kube-state-metrics-0\" (UID: \"0bbf5934-e11a-4118-afde-57ecc9d95cd0\") " pod="openstack/kube-state-metrics-0" Dec 09 04:57:35 crc kubenswrapper[4766]: I1209 04:57:35.454695 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:35.888263 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:35.890824 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:35.904678 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:35.904767 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-hwrj8" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:35.904880 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:35.912575 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:35.912753 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:35.950485 4766 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.085144 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.085464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e98929a1-109f-4cbc-91a8-929a852d163b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.085516 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcqb8\" (UniqueName: \"kubernetes.io/projected/e98929a1-109f-4cbc-91a8-929a852d163b-kube-api-access-tcqb8\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.085546 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.085572 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-cluster-tls-config\") pod 
\"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.085609 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/e98929a1-109f-4cbc-91a8-929a852d163b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.085632 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e98929a1-109f-4cbc-91a8-929a852d163b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.201084 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.201154 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e98929a1-109f-4cbc-91a8-929a852d163b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.201238 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcqb8\" (UniqueName: 
\"kubernetes.io/projected/e98929a1-109f-4cbc-91a8-929a852d163b-kube-api-access-tcqb8\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.201274 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.201309 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.201359 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/e98929a1-109f-4cbc-91a8-929a852d163b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.201384 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e98929a1-109f-4cbc-91a8-929a852d163b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.210945 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/e98929a1-109f-4cbc-91a8-929a852d163b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.241447 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.276356 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.282576 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e98929a1-109f-4cbc-91a8-929a852d163b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.294024 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e98929a1-109f-4cbc-91a8-929a852d163b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.294462 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e98929a1-109f-4cbc-91a8-929a852d163b-tls-assets\") pod 
\"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.296200 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcqb8\" (UniqueName: \"kubernetes.io/projected/e98929a1-109f-4cbc-91a8-929a852d163b-kube-api-access-tcqb8\") pod \"alertmanager-metric-storage-0\" (UID: \"e98929a1-109f-4cbc-91a8-929a852d163b\") " pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.551281 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.565289 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.569812 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.570265 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.570542 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gnkmq" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.570668 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.570834 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.573421 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 09 04:57:36 crc 
kubenswrapper[4766]: I1209 04:57:36.585806 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.589981 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.744464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b1b378fc-a47f-4d0f-9e2e-6d5d47d8f168\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b1b378fc-a47f-4d0f-9e2e-6d5d47d8f168\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.744538 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-config\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.744565 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.744584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqkhr\" (UniqueName: \"kubernetes.io/projected/375a1ede-7b02-4a66-aed0-02666883a8f2-kube-api-access-cqkhr\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " 
pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.744607 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/375a1ede-7b02-4a66-aed0-02666883a8f2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.744633 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/375a1ede-7b02-4a66-aed0-02666883a8f2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.744683 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/375a1ede-7b02-4a66-aed0-02666883a8f2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.744748 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.846439 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b1b378fc-a47f-4d0f-9e2e-6d5d47d8f168\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b1b378fc-a47f-4d0f-9e2e-6d5d47d8f168\") pod 
\"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.846522 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-config\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.846557 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.846578 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqkhr\" (UniqueName: \"kubernetes.io/projected/375a1ede-7b02-4a66-aed0-02666883a8f2-kube-api-access-cqkhr\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.846597 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/375a1ede-7b02-4a66-aed0-02666883a8f2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.846623 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/375a1ede-7b02-4a66-aed0-02666883a8f2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.846677 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/375a1ede-7b02-4a66-aed0-02666883a8f2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.846761 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.852031 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/375a1ede-7b02-4a66-aed0-02666883a8f2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.853787 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.854288 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/375a1ede-7b02-4a66-aed0-02666883a8f2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.857165 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.858785 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/375a1ede-7b02-4a66-aed0-02666883a8f2-config\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.878472 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/375a1ede-7b02-4a66-aed0-02666883a8f2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.882114 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqkhr\" (UniqueName: \"kubernetes.io/projected/375a1ede-7b02-4a66-aed0-02666883a8f2-kube-api-access-cqkhr\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.951487 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.958087 4766 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 04:57:36 crc kubenswrapper[4766]: I1209 04:57:36.958116 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b1b378fc-a47f-4d0f-9e2e-6d5d47d8f168\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b1b378fc-a47f-4d0f-9e2e-6d5d47d8f168\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81a6e0584c0e69ab0db487d411ad0451ca4031d2b9cb87d2dbd7f40385b94152/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.044193 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b1b378fc-a47f-4d0f-9e2e-6d5d47d8f168\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b1b378fc-a47f-4d0f-9e2e-6d5d47d8f168\") pod \"prometheus-metric-storage-0\" (UID: \"375a1ede-7b02-4a66-aed0-02666883a8f2\") " pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.107718 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.249876 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.318058 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.318120 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.318184 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.319192 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.319343 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" gracePeriod=600 Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.409939 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"8d7295db-9761-4829-93af-705e9af64b1c","Type":"ContainerStarted","Data":"1f77c0fb5e3517ccd256d8c0b233b43fae8ea410937dd3e48ff788b04dc5b3d7"} Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.423603 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.427706 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0bbf5934-e11a-4118-afde-57ecc9d95cd0","Type":"ContainerStarted","Data":"6e94dcfaf95a48bfcb0254a1069eb724b91862a83966c883d709ddd3c91d7dc2"} Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.429333 4766 generic.go:334] "Generic (PLEG): container finished" podID="1efe8f28-9359-4214-911b-594db568c9a5" containerID="eda57d876cfc768360846060dd21b3ebad99c69d1a82fda52cbc45b7acfaf5b1" exitCode=137 Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.462478 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 04:57:37 crc kubenswrapper[4766]: E1209 04:57:37.486516 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.586054 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config\") pod \"1efe8f28-9359-4214-911b-594db568c9a5\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.586563 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5fqq\" (UniqueName: \"kubernetes.io/projected/1efe8f28-9359-4214-911b-594db568c9a5-kube-api-access-z5fqq\") pod \"1efe8f28-9359-4214-911b-594db568c9a5\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.586693 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config-secret\") pod \"1efe8f28-9359-4214-911b-594db568c9a5\" (UID: \"1efe8f28-9359-4214-911b-594db568c9a5\") " Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.594964 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efe8f28-9359-4214-911b-594db568c9a5-kube-api-access-z5fqq" (OuterVolumeSpecName: "kube-api-access-z5fqq") pod "1efe8f28-9359-4214-911b-594db568c9a5" (UID: 
"1efe8f28-9359-4214-911b-594db568c9a5"). InnerVolumeSpecName "kube-api-access-z5fqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.615078 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1efe8f28-9359-4214-911b-594db568c9a5" (UID: "1efe8f28-9359-4214-911b-594db568c9a5"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.657267 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1efe8f28-9359-4214-911b-594db568c9a5" (UID: "1efe8f28-9359-4214-911b-594db568c9a5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.689735 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5fqq\" (UniqueName: \"kubernetes.io/projected/1efe8f28-9359-4214-911b-594db568c9a5-kube-api-access-z5fqq\") on node \"crc\" DevicePath \"\"" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.689788 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.689810 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1efe8f28-9359-4214-911b-594db568c9a5-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:57:37 crc kubenswrapper[4766]: I1209 04:57:37.819758 4766 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 09 04:57:37 crc kubenswrapper[4766]: W1209 04:57:37.823305 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod375a1ede_7b02_4a66_aed0_02666883a8f2.slice/crio-22fc9af5a279c9148eeeedeab7e195c50330df80fbfd4fbc593390a7c9e37a71 WatchSource:0}: Error finding container 22fc9af5a279c9148eeeedeab7e195c50330df80fbfd4fbc593390a7c9e37a71: Status 404 returned error can't find the container with id 22fc9af5a279c9148eeeedeab7e195c50330df80fbfd4fbc593390a7c9e37a71 Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.442256 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"e98929a1-109f-4cbc-91a8-929a852d163b","Type":"ContainerStarted","Data":"020f00a0dfc33966a42af817771e43e1f0d35434f1b23317454a40786c0b9007"} Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.444668 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8d7295db-9761-4829-93af-705e9af64b1c","Type":"ContainerStarted","Data":"c9bfe47e13ec2fcb42134caa5a418aa14c08987f696c80806213ef5453b02c89"} Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.447430 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0bbf5934-e11a-4118-afde-57ecc9d95cd0","Type":"ContainerStarted","Data":"a2971a6fa290e866d9a5c04a97e9ffefc7c8cde97d3f8e43037583c07db7cbaf"} Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.447584 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.450892 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" exitCode=0 Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 
04:57:38.450961 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b"} Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.451007 4766 scope.go:117] "RemoveContainer" containerID="0fb3a256512499ae16934c9ba9ee1a59382e5cd9e65f810fb30d76fbdefbf701" Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.451553 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:57:38 crc kubenswrapper[4766]: E1209 04:57:38.451965 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.452703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"375a1ede-7b02-4a66-aed0-02666883a8f2","Type":"ContainerStarted","Data":"22fc9af5a279c9148eeeedeab7e195c50330df80fbfd4fbc593390a7c9e37a71"} Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.456648 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.474383 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.4743631409999995 podStartE2EDuration="4.474363141s" podCreationTimestamp="2025-12-09 04:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:57:38.46543163 +0000 UTC m=+6340.174737066" watchObservedRunningTime="2025-12-09 04:57:38.474363141 +0000 UTC m=+6340.183668577" Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.512719 4766 scope.go:117] "RemoveContainer" containerID="eda57d876cfc768360846060dd21b3ebad99c69d1a82fda52cbc45b7acfaf5b1" Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.543948 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.017887017 podStartE2EDuration="3.543928702s" podCreationTimestamp="2025-12-09 04:57:35 +0000 UTC" firstStartedPulling="2025-12-09 04:57:36.953992958 +0000 UTC m=+6338.663298384" lastFinishedPulling="2025-12-09 04:57:37.480034643 +0000 UTC m=+6339.189340069" observedRunningTime="2025-12-09 04:57:38.538948647 +0000 UTC m=+6340.248254083" watchObservedRunningTime="2025-12-09 04:57:38.543928702 +0000 UTC m=+6340.253234138" Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.555340 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1efe8f28-9359-4214-911b-594db568c9a5" podUID="8d7295db-9761-4829-93af-705e9af64b1c" Dec 09 04:57:38 crc kubenswrapper[4766]: I1209 04:57:38.924718 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efe8f28-9359-4214-911b-594db568c9a5" path="/var/lib/kubelet/pods/1efe8f28-9359-4214-911b-594db568c9a5/volumes" Dec 09 04:57:39 crc kubenswrapper[4766]: I1209 
04:57:39.401859 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:39 crc kubenswrapper[4766]: I1209 04:57:39.461814 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:39 crc kubenswrapper[4766]: I1209 04:57:39.638228 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ml5b7"] Dec 09 04:57:40 crc kubenswrapper[4766]: I1209 04:57:40.488087 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ml5b7" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="registry-server" containerID="cri-o://1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448" gracePeriod=2 Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.103006 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.202960 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-utilities\") pod \"8f631587-0e09-4a40-97a5-83e97c1c2242\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.203785 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-utilities" (OuterVolumeSpecName: "utilities") pod "8f631587-0e09-4a40-97a5-83e97c1c2242" (UID: "8f631587-0e09-4a40-97a5-83e97c1c2242"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.206335 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-catalog-content\") pod \"8f631587-0e09-4a40-97a5-83e97c1c2242\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.211975 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7thmf\" (UniqueName: \"kubernetes.io/projected/8f631587-0e09-4a40-97a5-83e97c1c2242-kube-api-access-7thmf\") pod \"8f631587-0e09-4a40-97a5-83e97c1c2242\" (UID: \"8f631587-0e09-4a40-97a5-83e97c1c2242\") " Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.214077 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.216492 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f631587-0e09-4a40-97a5-83e97c1c2242-kube-api-access-7thmf" (OuterVolumeSpecName: "kube-api-access-7thmf") pod "8f631587-0e09-4a40-97a5-83e97c1c2242" (UID: "8f631587-0e09-4a40-97a5-83e97c1c2242"). InnerVolumeSpecName "kube-api-access-7thmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.267367 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f631587-0e09-4a40-97a5-83e97c1c2242" (UID: "8f631587-0e09-4a40-97a5-83e97c1c2242"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.316463 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f631587-0e09-4a40-97a5-83e97c1c2242-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.316495 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7thmf\" (UniqueName: \"kubernetes.io/projected/8f631587-0e09-4a40-97a5-83e97c1c2242-kube-api-access-7thmf\") on node \"crc\" DevicePath \"\"" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.502966 4766 generic.go:334] "Generic (PLEG): container finished" podID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerID="1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448" exitCode=0 Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.503018 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml5b7" event={"ID":"8f631587-0e09-4a40-97a5-83e97c1c2242","Type":"ContainerDied","Data":"1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448"} Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.503298 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml5b7" event={"ID":"8f631587-0e09-4a40-97a5-83e97c1c2242","Type":"ContainerDied","Data":"84139aca8158efde3900df3beec2832053abd0708b50e2e7998d2246353fd068"} Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.503325 4766 scope.go:117] "RemoveContainer" containerID="1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.503080 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ml5b7" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.537322 4766 scope.go:117] "RemoveContainer" containerID="8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.595716 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ml5b7"] Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.607298 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ml5b7"] Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.611786 4766 scope.go:117] "RemoveContainer" containerID="15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.665484 4766 scope.go:117] "RemoveContainer" containerID="1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448" Dec 09 04:57:41 crc kubenswrapper[4766]: E1209 04:57:41.666526 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448\": container with ID starting with 1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448 not found: ID does not exist" containerID="1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.666607 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448"} err="failed to get container status \"1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448\": rpc error: code = NotFound desc = could not find container \"1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448\": container with ID starting with 1cb3e3fc922d0b143f277fc9b37839bb39ecf28e2d365f7dcc97eea9d5346448 not 
found: ID does not exist" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.666677 4766 scope.go:117] "RemoveContainer" containerID="8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e" Dec 09 04:57:41 crc kubenswrapper[4766]: E1209 04:57:41.667176 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e\": container with ID starting with 8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e not found: ID does not exist" containerID="8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.667269 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e"} err="failed to get container status \"8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e\": rpc error: code = NotFound desc = could not find container \"8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e\": container with ID starting with 8d1edcd128cc5f3f707826969269a5b98c2985a8fbde80cdb0af07bf6484b09e not found: ID does not exist" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.667345 4766 scope.go:117] "RemoveContainer" containerID="15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20" Dec 09 04:57:41 crc kubenswrapper[4766]: E1209 04:57:41.667681 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20\": container with ID starting with 15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20 not found: ID does not exist" containerID="15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20" Dec 09 04:57:41 crc kubenswrapper[4766]: I1209 04:57:41.667764 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20"} err="failed to get container status \"15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20\": rpc error: code = NotFound desc = could not find container \"15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20\": container with ID starting with 15952f0e16cdb298d52b2cf3710a06bc377413d54303d7ff0e9571fb67b9fc20 not found: ID does not exist" Dec 09 04:57:42 crc kubenswrapper[4766]: I1209 04:57:42.853515 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" path="/var/lib/kubelet/pods/8f631587-0e09-4a40-97a5-83e97c1c2242/volumes" Dec 09 04:57:44 crc kubenswrapper[4766]: I1209 04:57:44.540570 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"375a1ede-7b02-4a66-aed0-02666883a8f2","Type":"ContainerStarted","Data":"dcb348a819b660f2a2d6ed145a0cc593898c23fa9b5f12886a97467848ee3155"} Dec 09 04:57:44 crc kubenswrapper[4766]: I1209 04:57:44.542497 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"e98929a1-109f-4cbc-91a8-929a852d163b","Type":"ContainerStarted","Data":"45e7473614c1595df03e718db87b319bae4cd61889da01746824030fb7c5c8b5"} Dec 09 04:57:45 crc kubenswrapper[4766]: I1209 04:57:45.461127 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 09 04:57:52 crc kubenswrapper[4766]: I1209 04:57:52.648454 4766 generic.go:334] "Generic (PLEG): container finished" podID="e98929a1-109f-4cbc-91a8-929a852d163b" containerID="45e7473614c1595df03e718db87b319bae4cd61889da01746824030fb7c5c8b5" exitCode=0 Dec 09 04:57:52 crc kubenswrapper[4766]: I1209 04:57:52.648529 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"e98929a1-109f-4cbc-91a8-929a852d163b","Type":"ContainerDied","Data":"45e7473614c1595df03e718db87b319bae4cd61889da01746824030fb7c5c8b5"} Dec 09 04:57:52 crc kubenswrapper[4766]: I1209 04:57:52.840599 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:57:52 crc kubenswrapper[4766]: E1209 04:57:52.842526 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:57:53 crc kubenswrapper[4766]: I1209 04:57:53.667036 4766 generic.go:334] "Generic (PLEG): container finished" podID="375a1ede-7b02-4a66-aed0-02666883a8f2" containerID="dcb348a819b660f2a2d6ed145a0cc593898c23fa9b5f12886a97467848ee3155" exitCode=0 Dec 09 04:57:53 crc kubenswrapper[4766]: I1209 04:57:53.667099 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"375a1ede-7b02-4a66-aed0-02666883a8f2","Type":"ContainerDied","Data":"dcb348a819b660f2a2d6ed145a0cc593898c23fa9b5f12886a97467848ee3155"} Dec 09 04:57:55 crc kubenswrapper[4766]: I1209 04:57:55.704544 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"e98929a1-109f-4cbc-91a8-929a852d163b","Type":"ContainerStarted","Data":"0a79a1057746ff6a262251c5c8b9a88f33663c5af607135152db32d07e31d5cc"} Dec 09 04:57:58 crc kubenswrapper[4766]: I1209 04:57:58.745753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"e98929a1-109f-4cbc-91a8-929a852d163b","Type":"ContainerStarted","Data":"98eb39fd1eebcc8426a3d74ac251c169b4597cd176b557d58310bae1d3dc7351"} Dec 09 04:57:58 crc kubenswrapper[4766]: I1209 04:57:58.746345 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:58 crc kubenswrapper[4766]: I1209 04:57:58.748784 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 09 04:57:58 crc kubenswrapper[4766]: I1209 04:57:58.780233 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.500253386 podStartE2EDuration="23.78020174s" podCreationTimestamp="2025-12-09 04:57:35 +0000 UTC" firstStartedPulling="2025-12-09 04:57:37.532307807 +0000 UTC m=+6339.241613233" lastFinishedPulling="2025-12-09 04:57:54.812256141 +0000 UTC m=+6356.521561587" observedRunningTime="2025-12-09 04:57:58.771280738 +0000 UTC m=+6360.480586194" watchObservedRunningTime="2025-12-09 04:57:58.78020174 +0000 UTC m=+6360.489507166" Dec 09 04:57:59 crc kubenswrapper[4766]: I1209 04:57:59.761609 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"375a1ede-7b02-4a66-aed0-02666883a8f2","Type":"ContainerStarted","Data":"bbd0dd2f5e8293b403c884425b24c1c6ef14651b6c4e08de0b22a128c00ea419"} Dec 09 04:58:04 crc kubenswrapper[4766]: I1209 04:58:04.828752 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"375a1ede-7b02-4a66-aed0-02666883a8f2","Type":"ContainerStarted","Data":"76066477dc55632028ea30db32ab94fd74b262c67fc6ae8bd2ae25c3f9cb21ae"} Dec 09 04:58:05 crc kubenswrapper[4766]: I1209 04:58:05.839940 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:58:05 crc kubenswrapper[4766]: E1209 
04:58:05.840531 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:58:08 crc kubenswrapper[4766]: I1209 04:58:08.895065 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"375a1ede-7b02-4a66-aed0-02666883a8f2","Type":"ContainerStarted","Data":"becc465c086b99fef7f61e483d6976a56555b9cfefbbd55b900b0ec2582150c7"} Dec 09 04:58:08 crc kubenswrapper[4766]: I1209 04:58:08.921578 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.984092474 podStartE2EDuration="33.921559396s" podCreationTimestamp="2025-12-09 04:57:35 +0000 UTC" firstStartedPulling="2025-12-09 04:57:37.825506654 +0000 UTC m=+6339.534812080" lastFinishedPulling="2025-12-09 04:58:07.762973576 +0000 UTC m=+6369.472279002" observedRunningTime="2025-12-09 04:58:08.918608076 +0000 UTC m=+6370.627913502" watchObservedRunningTime="2025-12-09 04:58:08.921559396 +0000 UTC m=+6370.630864822" Dec 09 04:58:12 crc kubenswrapper[4766]: I1209 04:58:12.250674 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 09 04:58:14 crc kubenswrapper[4766]: I1209 04:58:14.040918 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-n4llq"] Dec 09 04:58:14 crc kubenswrapper[4766]: I1209 04:58:14.051473 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qnm8w"] Dec 09 04:58:14 crc kubenswrapper[4766]: I1209 04:58:14.061780 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-db-create-qnm8w"] Dec 09 04:58:14 crc kubenswrapper[4766]: I1209 04:58:14.072860 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-n4llq"] Dec 09 04:58:14 crc kubenswrapper[4766]: I1209 04:58:14.856498 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0daf6d0a-9850-471a-aaaa-dfe3ca829a7d" path="/var/lib/kubelet/pods/0daf6d0a-9850-471a-aaaa-dfe3ca829a7d/volumes" Dec 09 04:58:14 crc kubenswrapper[4766]: I1209 04:58:14.857363 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6005b1e9-bb74-426c-ae88-061a16303fdf" path="/var/lib/kubelet/pods/6005b1e9-bb74-426c-ae88-061a16303fdf/volumes" Dec 09 04:58:15 crc kubenswrapper[4766]: I1209 04:58:15.514789 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9bpvj"] Dec 09 04:58:15 crc kubenswrapper[4766]: I1209 04:58:15.514846 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d0a7-account-create-update-zgx2j"] Dec 09 04:58:15 crc kubenswrapper[4766]: I1209 04:58:15.514862 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b4cc-account-create-update-c7wqv"] Dec 09 04:58:15 crc kubenswrapper[4766]: I1209 04:58:15.514872 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-bc6f-account-create-update-5v5bc"] Dec 09 04:58:15 crc kubenswrapper[4766]: I1209 04:58:15.514897 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-9bpvj"] Dec 09 04:58:15 crc kubenswrapper[4766]: I1209 04:58:15.514916 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d0a7-account-create-update-zgx2j"] Dec 09 04:58:15 crc kubenswrapper[4766]: I1209 04:58:15.514927 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b4cc-account-create-update-c7wqv"] Dec 09 04:58:15 crc kubenswrapper[4766]: I1209 04:58:15.514937 4766 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-bc6f-account-create-update-5v5bc"] Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.111588 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:16 crc kubenswrapper[4766]: E1209 04:58:16.112539 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="registry-server" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.112612 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="registry-server" Dec 09 04:58:16 crc kubenswrapper[4766]: E1209 04:58:16.112666 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="extract-utilities" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.112714 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="extract-utilities" Dec 09 04:58:16 crc kubenswrapper[4766]: E1209 04:58:16.112803 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="extract-content" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.112856 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="extract-content" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.113090 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f631587-0e09-4a40-97a5-83e97c1c2242" containerName="registry-server" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.115081 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.118057 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.120486 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.123947 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.218406 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.218516 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58f8q\" (UniqueName: \"kubernetes.io/projected/cdf4e098-118e-4689-9009-118fc46215df-kube-api-access-58f8q\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.218550 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-run-httpd\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.218586 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.218651 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-config-data\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.218693 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-log-httpd\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.218732 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-scripts\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.334933 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-config-data\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.335059 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-log-httpd\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.335312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-scripts\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.335401 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.335571 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58f8q\" (UniqueName: \"kubernetes.io/projected/cdf4e098-118e-4689-9009-118fc46215df-kube-api-access-58f8q\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.335745 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-run-httpd\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.335918 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.341635 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-log-httpd\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 
crc kubenswrapper[4766]: I1209 04:58:16.343527 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-run-httpd\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.345691 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-config-data\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.353924 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-scripts\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.354279 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.362355 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58f8q\" (UniqueName: \"kubernetes.io/projected/cdf4e098-118e-4689-9009-118fc46215df-kube-api-access-58f8q\") pod \"ceilometer-0\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.368943 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"cdf4e098-118e-4689-9009-118fc46215df\") " pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.438198 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.857737 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed" path="/var/lib/kubelet/pods/0da2c5d7-0a81-43ac-b5bf-dbab2ec572ed/volumes" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.859473 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6" path="/var/lib/kubelet/pods/1b0b6d0d-1057-441d-9ed1-ec95aa9f73e6/volumes" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.860299 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b843c5e-e828-45a8-ad5c-d45c63c9d4dd" path="/var/lib/kubelet/pods/3b843c5e-e828-45a8-ad5c-d45c63c9d4dd/volumes" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.861065 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d564da-9292-4470-98df-262313eab805" path="/var/lib/kubelet/pods/c3d564da-9292-4470-98df-262313eab805/volumes" Dec 09 04:58:16 crc kubenswrapper[4766]: I1209 04:58:16.921262 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:16 crc kubenswrapper[4766]: W1209 04:58:16.922134 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdf4e098_118e_4689_9009_118fc46215df.slice/crio-4fc18de7e1e5558a9f8240e048d7d9bbc6001024bb58fad596c30778962febf9 WatchSource:0}: Error finding container 4fc18de7e1e5558a9f8240e048d7d9bbc6001024bb58fad596c30778962febf9: Status 404 returned error can't find the container with id 4fc18de7e1e5558a9f8240e048d7d9bbc6001024bb58fad596c30778962febf9 Dec 09 04:58:17 crc kubenswrapper[4766]: I1209 04:58:17.016267 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerStarted","Data":"4fc18de7e1e5558a9f8240e048d7d9bbc6001024bb58fad596c30778962febf9"} Dec 09 04:58:18 crc kubenswrapper[4766]: I1209 04:58:18.034576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerStarted","Data":"1ad853eb8fb6baba5cda29808a8c3dfb304d52c785a2f44cb621d82e3e219321"} Dec 09 04:58:19 crc kubenswrapper[4766]: I1209 04:58:19.045885 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerStarted","Data":"829fdb86937093eba839915314f3d4df406acb97dcbd038b9a91a50d745d7e54"} Dec 09 04:58:19 crc kubenswrapper[4766]: I1209 04:58:19.838985 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:58:19 crc kubenswrapper[4766]: E1209 04:58:19.839561 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:58:20 crc kubenswrapper[4766]: I1209 04:58:20.062927 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerStarted","Data":"8e31149e8a26e2858fd11e324f3e0cc84a94d578e538873745833f46d77ea690"} Dec 09 04:58:21 crc kubenswrapper[4766]: I1209 04:58:21.084461 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerStarted","Data":"ab32a41a7e73b6ea79dba1b62b47ea3cbf9102c74edb5891ba6ea56e7772af52"} Dec 09 04:58:21 crc kubenswrapper[4766]: I1209 04:58:21.086189 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 04:58:21 crc kubenswrapper[4766]: I1209 04:58:21.119333 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.589362725 podStartE2EDuration="5.11931257s" podCreationTimestamp="2025-12-09 04:58:16 +0000 UTC" firstStartedPulling="2025-12-09 04:58:16.92462545 +0000 UTC m=+6378.633930896" lastFinishedPulling="2025-12-09 04:58:20.454575315 +0000 UTC m=+6382.163880741" observedRunningTime="2025-12-09 04:58:21.111535989 +0000 UTC m=+6382.820841445" watchObservedRunningTime="2025-12-09 04:58:21.11931257 +0000 UTC m=+6382.828617996" Dec 09 04:58:22 crc kubenswrapper[4766]: I1209 04:58:22.250886 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 09 04:58:22 crc kubenswrapper[4766]: I1209 04:58:22.255281 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 09 04:58:22 crc kubenswrapper[4766]: I1209 04:58:22.927682 4766 scope.go:117] "RemoveContainer" containerID="e9064e2f4b501eeb5dd16aeb6769838465b8cce0ba57d339a919f19b7ea616a7" Dec 09 04:58:22 crc kubenswrapper[4766]: I1209 04:58:22.964814 4766 scope.go:117] "RemoveContainer" containerID="93b20b732e67e1f721660a541780a2f23abee214b8a04d7d5617c502663c1003" Dec 09 04:58:23 crc kubenswrapper[4766]: I1209 04:58:23.046190 4766 scope.go:117] "RemoveContainer" containerID="ea3089b388f93b90c4febab01ccdef0444771b341f536b59f798f9039a8f7bea" Dec 09 04:58:23 crc kubenswrapper[4766]: I1209 04:58:23.054393 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82hr8"] Dec 09 
04:58:23 crc kubenswrapper[4766]: I1209 04:58:23.067687 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82hr8"] Dec 09 04:58:23 crc kubenswrapper[4766]: I1209 04:58:23.122592 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 09 04:58:23 crc kubenswrapper[4766]: I1209 04:58:23.127112 4766 scope.go:117] "RemoveContainer" containerID="a070f7d80b9e18f2f8b372c2fdde9e51753dfc93f34db06b5b7f6ba29168d39e" Dec 09 04:58:23 crc kubenswrapper[4766]: I1209 04:58:23.228066 4766 scope.go:117] "RemoveContainer" containerID="4976aed5bee782c4796147d30bd20323b7b67bf1b0b3b8a4803d1732f49e2c71" Dec 09 04:58:23 crc kubenswrapper[4766]: I1209 04:58:23.308414 4766 scope.go:117] "RemoveContainer" containerID="e5ba430ea6d7bf02f687174100578640b5e7dea967aadb2ff404b2ac4f3e6ae5" Dec 09 04:58:24 crc kubenswrapper[4766]: I1209 04:58:24.855933 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d" path="/var/lib/kubelet/pods/2d2e5e99-a8fb-4fde-a433-bc2998c3dd0d/volumes" Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.762498 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-jklmr"] Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.764784 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.772308 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jklmr"] Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.865640 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-ec05-account-create-update-ls9kv"] Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.867156 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.869658 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.878392 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-ec05-account-create-update-ls9kv"] Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.922800 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r7xg\" (UniqueName: \"kubernetes.io/projected/b0881fad-40f0-492b-ac13-742c63e7f260-kube-api-access-5r7xg\") pod \"aodh-db-create-jklmr\" (UID: \"b0881fad-40f0-492b-ac13-742c63e7f260\") " pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:27 crc kubenswrapper[4766]: I1209 04:58:27.922904 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0881fad-40f0-492b-ac13-742c63e7f260-operator-scripts\") pod \"aodh-db-create-jklmr\" (UID: \"b0881fad-40f0-492b-ac13-742c63e7f260\") " pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.024649 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r7xg\" (UniqueName: \"kubernetes.io/projected/b0881fad-40f0-492b-ac13-742c63e7f260-kube-api-access-5r7xg\") pod \"aodh-db-create-jklmr\" (UID: \"b0881fad-40f0-492b-ac13-742c63e7f260\") " pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.024693 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-operator-scripts\") pod \"aodh-ec05-account-create-update-ls9kv\" (UID: \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\") " 
pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.024769 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0881fad-40f0-492b-ac13-742c63e7f260-operator-scripts\") pod \"aodh-db-create-jklmr\" (UID: \"b0881fad-40f0-492b-ac13-742c63e7f260\") " pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.024795 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxmcw\" (UniqueName: \"kubernetes.io/projected/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-kube-api-access-hxmcw\") pod \"aodh-ec05-account-create-update-ls9kv\" (UID: \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\") " pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.025566 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0881fad-40f0-492b-ac13-742c63e7f260-operator-scripts\") pod \"aodh-db-create-jklmr\" (UID: \"b0881fad-40f0-492b-ac13-742c63e7f260\") " pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.055697 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r7xg\" (UniqueName: \"kubernetes.io/projected/b0881fad-40f0-492b-ac13-742c63e7f260-kube-api-access-5r7xg\") pod \"aodh-db-create-jklmr\" (UID: \"b0881fad-40f0-492b-ac13-742c63e7f260\") " pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.082620 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.128254 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-operator-scripts\") pod \"aodh-ec05-account-create-update-ls9kv\" (UID: \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\") " pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.128847 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxmcw\" (UniqueName: \"kubernetes.io/projected/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-kube-api-access-hxmcw\") pod \"aodh-ec05-account-create-update-ls9kv\" (UID: \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\") " pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.129167 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-operator-scripts\") pod \"aodh-ec05-account-create-update-ls9kv\" (UID: \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\") " pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.149135 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxmcw\" (UniqueName: \"kubernetes.io/projected/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-kube-api-access-hxmcw\") pod \"aodh-ec05-account-create-update-ls9kv\" (UID: \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\") " pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.188750 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.691169 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-ec05-account-create-update-ls9kv"] Dec 09 04:58:28 crc kubenswrapper[4766]: W1209 04:58:28.697680 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0b5b8f_1a7e_4281_be26_72dd9e91573e.slice/crio-183e1a85e40914570f4aae39f7891910ea61bab7b6e1c05c9161baffcb2e61cd WatchSource:0}: Error finding container 183e1a85e40914570f4aae39f7891910ea61bab7b6e1c05c9161baffcb2e61cd: Status 404 returned error can't find the container with id 183e1a85e40914570f4aae39f7891910ea61bab7b6e1c05c9161baffcb2e61cd Dec 09 04:58:28 crc kubenswrapper[4766]: I1209 04:58:28.700835 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jklmr"] Dec 09 04:58:29 crc kubenswrapper[4766]: I1209 04:58:29.207679 4766 generic.go:334] "Generic (PLEG): container finished" podID="5b0b5b8f-1a7e-4281-be26-72dd9e91573e" containerID="e451e0a3fa8bfefe38cb73a790738200f8dd7a5de5b3e02e78d024d38a1a1b67" exitCode=0 Dec 09 04:58:29 crc kubenswrapper[4766]: I1209 04:58:29.207763 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ec05-account-create-update-ls9kv" event={"ID":"5b0b5b8f-1a7e-4281-be26-72dd9e91573e","Type":"ContainerDied","Data":"e451e0a3fa8bfefe38cb73a790738200f8dd7a5de5b3e02e78d024d38a1a1b67"} Dec 09 04:58:29 crc kubenswrapper[4766]: I1209 04:58:29.208111 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ec05-account-create-update-ls9kv" event={"ID":"5b0b5b8f-1a7e-4281-be26-72dd9e91573e","Type":"ContainerStarted","Data":"183e1a85e40914570f4aae39f7891910ea61bab7b6e1c05c9161baffcb2e61cd"} Dec 09 04:58:29 crc kubenswrapper[4766]: I1209 04:58:29.209854 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="b0881fad-40f0-492b-ac13-742c63e7f260" containerID="de522f19ed5448ffb6f0ed59bbe038f1c300b2ed636208004edb4ca7c9fb83e0" exitCode=0 Dec 09 04:58:29 crc kubenswrapper[4766]: I1209 04:58:29.209892 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jklmr" event={"ID":"b0881fad-40f0-492b-ac13-742c63e7f260","Type":"ContainerDied","Data":"de522f19ed5448ffb6f0ed59bbe038f1c300b2ed636208004edb4ca7c9fb83e0"} Dec 09 04:58:29 crc kubenswrapper[4766]: I1209 04:58:29.209918 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jklmr" event={"ID":"b0881fad-40f0-492b-ac13-742c63e7f260","Type":"ContainerStarted","Data":"6ac2ab8c2bcdfe438324bfe58988590fb99b3b4bdff1388d88ff96e93960ed86"} Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.869463 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.876334 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.991749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxmcw\" (UniqueName: \"kubernetes.io/projected/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-kube-api-access-hxmcw\") pod \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\" (UID: \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\") " Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.991818 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-operator-scripts\") pod \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\" (UID: \"5b0b5b8f-1a7e-4281-be26-72dd9e91573e\") " Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.991852 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0881fad-40f0-492b-ac13-742c63e7f260-operator-scripts\") pod \"b0881fad-40f0-492b-ac13-742c63e7f260\" (UID: \"b0881fad-40f0-492b-ac13-742c63e7f260\") " Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.991954 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r7xg\" (UniqueName: \"kubernetes.io/projected/b0881fad-40f0-492b-ac13-742c63e7f260-kube-api-access-5r7xg\") pod \"b0881fad-40f0-492b-ac13-742c63e7f260\" (UID: \"b0881fad-40f0-492b-ac13-742c63e7f260\") " Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.992483 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0881fad-40f0-492b-ac13-742c63e7f260-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0881fad-40f0-492b-ac13-742c63e7f260" (UID: "b0881fad-40f0-492b-ac13-742c63e7f260"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.993276 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b0b5b8f-1a7e-4281-be26-72dd9e91573e" (UID: "5b0b5b8f-1a7e-4281-be26-72dd9e91573e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.994078 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:30 crc kubenswrapper[4766]: I1209 04:58:30.994101 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0881fad-40f0-492b-ac13-742c63e7f260-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.000424 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0881fad-40f0-492b-ac13-742c63e7f260-kube-api-access-5r7xg" (OuterVolumeSpecName: "kube-api-access-5r7xg") pod "b0881fad-40f0-492b-ac13-742c63e7f260" (UID: "b0881fad-40f0-492b-ac13-742c63e7f260"). InnerVolumeSpecName "kube-api-access-5r7xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.000525 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-kube-api-access-hxmcw" (OuterVolumeSpecName: "kube-api-access-hxmcw") pod "5b0b5b8f-1a7e-4281-be26-72dd9e91573e" (UID: "5b0b5b8f-1a7e-4281-be26-72dd9e91573e"). InnerVolumeSpecName "kube-api-access-hxmcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.095610 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxmcw\" (UniqueName: \"kubernetes.io/projected/5b0b5b8f-1a7e-4281-be26-72dd9e91573e-kube-api-access-hxmcw\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.095649 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r7xg\" (UniqueName: \"kubernetes.io/projected/b0881fad-40f0-492b-ac13-742c63e7f260-kube-api-access-5r7xg\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.241144 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-ec05-account-create-update-ls9kv" event={"ID":"5b0b5b8f-1a7e-4281-be26-72dd9e91573e","Type":"ContainerDied","Data":"183e1a85e40914570f4aae39f7891910ea61bab7b6e1c05c9161baffcb2e61cd"} Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.241207 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="183e1a85e40914570f4aae39f7891910ea61bab7b6e1c05c9161baffcb2e61cd" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.241242 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-ec05-account-create-update-ls9kv" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.245368 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jklmr" event={"ID":"b0881fad-40f0-492b-ac13-742c63e7f260","Type":"ContainerDied","Data":"6ac2ab8c2bcdfe438324bfe58988590fb99b3b4bdff1388d88ff96e93960ed86"} Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.245417 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-jklmr" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.245445 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ac2ab8c2bcdfe438324bfe58988590fb99b3b4bdff1388d88ff96e93960ed86" Dec 09 04:58:31 crc kubenswrapper[4766]: I1209 04:58:31.840107 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:58:31 crc kubenswrapper[4766]: E1209 04:58:31.840774 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.178285 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-xczbt"] Dec 09 04:58:33 crc kubenswrapper[4766]: E1209 04:58:33.179074 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0b5b8f-1a7e-4281-be26-72dd9e91573e" containerName="mariadb-account-create-update" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.179088 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0b5b8f-1a7e-4281-be26-72dd9e91573e" containerName="mariadb-account-create-update" Dec 09 04:58:33 crc kubenswrapper[4766]: E1209 04:58:33.179111 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0881fad-40f0-492b-ac13-742c63e7f260" containerName="mariadb-database-create" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.179119 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0881fad-40f0-492b-ac13-742c63e7f260" containerName="mariadb-database-create" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 
04:58:33.179358 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0881fad-40f0-492b-ac13-742c63e7f260" containerName="mariadb-database-create" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.179372 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0b5b8f-1a7e-4281-be26-72dd9e91573e" containerName="mariadb-account-create-update" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.180127 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.182654 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.182838 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-bjm6f" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.183916 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.184120 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.191826 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-xczbt"] Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.249688 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcbz\" (UniqueName: \"kubernetes.io/projected/7ef90510-1996-4325-9bf1-2b68a0c95146-kube-api-access-4wcbz\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.249730 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-combined-ca-bundle\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.249777 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-config-data\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.249809 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-scripts\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.352412 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-config-data\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.352506 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-scripts\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.352834 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcbz\" (UniqueName: \"kubernetes.io/projected/7ef90510-1996-4325-9bf1-2b68a0c95146-kube-api-access-4wcbz\") pod \"aodh-db-sync-xczbt\" (UID: 
\"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.352890 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-combined-ca-bundle\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.363378 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-config-data\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.370642 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-combined-ca-bundle\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.376318 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-scripts\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.388027 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcbz\" (UniqueName: \"kubernetes.io/projected/7ef90510-1996-4325-9bf1-2b68a0c95146-kube-api-access-4wcbz\") pod \"aodh-db-sync-xczbt\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:33 crc kubenswrapper[4766]: I1209 04:58:33.542851 4766 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:34 crc kubenswrapper[4766]: I1209 04:58:34.089921 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-xczbt"] Dec 09 04:58:34 crc kubenswrapper[4766]: I1209 04:58:34.307688 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-xczbt" event={"ID":"7ef90510-1996-4325-9bf1-2b68a0c95146","Type":"ContainerStarted","Data":"158500893f017c943b45e3a954bf91dfb342e33dfff73c53e93aea91138c87eb"} Dec 09 04:58:38 crc kubenswrapper[4766]: I1209 04:58:38.057550 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7j22r"] Dec 09 04:58:38 crc kubenswrapper[4766]: I1209 04:58:38.069972 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7j22r"] Dec 09 04:58:38 crc kubenswrapper[4766]: I1209 04:58:38.854483 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877cf27e-eccf-4910-97c0-3fc7793f63a9" path="/var/lib/kubelet/pods/877cf27e-eccf-4910-97c0-3fc7793f63a9/volumes" Dec 09 04:58:39 crc kubenswrapper[4766]: I1209 04:58:39.026278 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kc2gn"] Dec 09 04:58:39 crc kubenswrapper[4766]: I1209 04:58:39.036663 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kc2gn"] Dec 09 04:58:39 crc kubenswrapper[4766]: I1209 04:58:39.372474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-xczbt" event={"ID":"7ef90510-1996-4325-9bf1-2b68a0c95146","Type":"ContainerStarted","Data":"958910ec0b9e1c9d51a524072f28c4417b5ba1b2dc63a066bde08d8ea860bd94"} Dec 09 04:58:39 crc kubenswrapper[4766]: I1209 04:58:39.407102 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-xczbt" podStartSLOduration=2.427617116 podStartE2EDuration="6.407082337s" 
podCreationTimestamp="2025-12-09 04:58:33 +0000 UTC" firstStartedPulling="2025-12-09 04:58:34.129875603 +0000 UTC m=+6395.839181029" lastFinishedPulling="2025-12-09 04:58:38.109340824 +0000 UTC m=+6399.818646250" observedRunningTime="2025-12-09 04:58:39.399798749 +0000 UTC m=+6401.109104205" watchObservedRunningTime="2025-12-09 04:58:39.407082337 +0000 UTC m=+6401.116387773" Dec 09 04:58:40 crc kubenswrapper[4766]: I1209 04:58:40.857016 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98381e1d-1d58-4913-b9da-e5fd14a6eed9" path="/var/lib/kubelet/pods/98381e1d-1d58-4913-b9da-e5fd14a6eed9/volumes" Dec 09 04:58:41 crc kubenswrapper[4766]: I1209 04:58:41.417298 4766 generic.go:334] "Generic (PLEG): container finished" podID="7ef90510-1996-4325-9bf1-2b68a0c95146" containerID="958910ec0b9e1c9d51a524072f28c4417b5ba1b2dc63a066bde08d8ea860bd94" exitCode=0 Dec 09 04:58:41 crc kubenswrapper[4766]: I1209 04:58:41.417364 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-xczbt" event={"ID":"7ef90510-1996-4325-9bf1-2b68a0c95146","Type":"ContainerDied","Data":"958910ec0b9e1c9d51a524072f28c4417b5ba1b2dc63a066bde08d8ea860bd94"} Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.808657 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.891297 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-config-data\") pod \"7ef90510-1996-4325-9bf1-2b68a0c95146\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.891644 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-combined-ca-bundle\") pod \"7ef90510-1996-4325-9bf1-2b68a0c95146\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.891950 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-scripts\") pod \"7ef90510-1996-4325-9bf1-2b68a0c95146\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.892236 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wcbz\" (UniqueName: \"kubernetes.io/projected/7ef90510-1996-4325-9bf1-2b68a0c95146-kube-api-access-4wcbz\") pod \"7ef90510-1996-4325-9bf1-2b68a0c95146\" (UID: \"7ef90510-1996-4325-9bf1-2b68a0c95146\") " Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.896792 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-scripts" (OuterVolumeSpecName: "scripts") pod "7ef90510-1996-4325-9bf1-2b68a0c95146" (UID: "7ef90510-1996-4325-9bf1-2b68a0c95146"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.898647 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef90510-1996-4325-9bf1-2b68a0c95146-kube-api-access-4wcbz" (OuterVolumeSpecName: "kube-api-access-4wcbz") pod "7ef90510-1996-4325-9bf1-2b68a0c95146" (UID: "7ef90510-1996-4325-9bf1-2b68a0c95146"). InnerVolumeSpecName "kube-api-access-4wcbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.928709 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-config-data" (OuterVolumeSpecName: "config-data") pod "7ef90510-1996-4325-9bf1-2b68a0c95146" (UID: "7ef90510-1996-4325-9bf1-2b68a0c95146"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.943826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ef90510-1996-4325-9bf1-2b68a0c95146" (UID: "7ef90510-1996-4325-9bf1-2b68a0c95146"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.995154 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wcbz\" (UniqueName: \"kubernetes.io/projected/7ef90510-1996-4325-9bf1-2b68a0c95146-kube-api-access-4wcbz\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.995227 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.995251 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:42 crc kubenswrapper[4766]: I1209 04:58:42.995265 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef90510-1996-4325-9bf1-2b68a0c95146-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:43 crc kubenswrapper[4766]: I1209 04:58:43.439712 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-xczbt" event={"ID":"7ef90510-1996-4325-9bf1-2b68a0c95146","Type":"ContainerDied","Data":"158500893f017c943b45e3a954bf91dfb342e33dfff73c53e93aea91138c87eb"} Dec 09 04:58:43 crc kubenswrapper[4766]: I1209 04:58:43.439970 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="158500893f017c943b45e3a954bf91dfb342e33dfff73c53e93aea91138c87eb" Dec 09 04:58:43 crc kubenswrapper[4766]: I1209 04:58:43.439837 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-xczbt" Dec 09 04:58:44 crc kubenswrapper[4766]: I1209 04:58:44.839893 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:58:44 crc kubenswrapper[4766]: E1209 04:58:44.840962 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.513869 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hw4rz"] Dec 09 04:58:45 crc kubenswrapper[4766]: E1209 04:58:45.514322 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef90510-1996-4325-9bf1-2b68a0c95146" containerName="aodh-db-sync" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.514338 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef90510-1996-4325-9bf1-2b68a0c95146" containerName="aodh-db-sync" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.514567 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef90510-1996-4325-9bf1-2b68a0c95146" containerName="aodh-db-sync" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.516133 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.531870 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw4rz"] Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.560441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-utilities\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.560508 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-catalog-content\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.560638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8tv6\" (UniqueName: \"kubernetes.io/projected/89f9297b-eee0-4e28-ba93-e13e8128be16-kube-api-access-h8tv6\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.662387 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-utilities\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.662459 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-catalog-content\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.662545 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8tv6\" (UniqueName: \"kubernetes.io/projected/89f9297b-eee0-4e28-ba93-e13e8128be16-kube-api-access-h8tv6\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.662832 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-utilities\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.662871 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-catalog-content\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.685485 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8tv6\" (UniqueName: \"kubernetes.io/projected/89f9297b-eee0-4e28-ba93-e13e8128be16-kube-api-access-h8tv6\") pod \"redhat-marketplace-hw4rz\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:45 crc kubenswrapper[4766]: I1209 04:58:45.873908 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:46 crc kubenswrapper[4766]: I1209 04:58:46.384101 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw4rz"] Dec 09 04:58:46 crc kubenswrapper[4766]: I1209 04:58:46.451320 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 04:58:46 crc kubenswrapper[4766]: I1209 04:58:46.487924 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw4rz" event={"ID":"89f9297b-eee0-4e28-ba93-e13e8128be16","Type":"ContainerStarted","Data":"a5fbbc4119084fcad59177048fe3f6f2f651a233561eca4539f808a6aa4012b8"} Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.499461 4766 generic.go:334] "Generic (PLEG): container finished" podID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerID="cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5" exitCode=0 Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.499517 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw4rz" event={"ID":"89f9297b-eee0-4e28-ba93-e13e8128be16","Type":"ContainerDied","Data":"cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5"} Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.876182 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.883816 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.892789 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.893105 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-bjm6f" Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.893251 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.907261 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.907506 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-scripts\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.907602 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-config-data\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.907644 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-combined-ca-bundle\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:47 crc kubenswrapper[4766]: I1209 04:58:47.907727 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj265\" (UniqueName: 
\"kubernetes.io/projected/280582e1-1357-4079-bf39-4328d162cbe8-kube-api-access-sj265\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.009776 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-scripts\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.010365 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-config-data\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.010401 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-combined-ca-bundle\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.010444 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj265\" (UniqueName: \"kubernetes.io/projected/280582e1-1357-4079-bf39-4328d162cbe8-kube-api-access-sj265\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.016874 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-combined-ca-bundle\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.021494 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-config-data\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.024637 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/280582e1-1357-4079-bf39-4328d162cbe8-scripts\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.040051 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj265\" (UniqueName: \"kubernetes.io/projected/280582e1-1357-4079-bf39-4328d162cbe8-kube-api-access-sj265\") pod \"aodh-0\" (UID: \"280582e1-1357-4079-bf39-4328d162cbe8\") " pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.234361 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.518672 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw4rz" event={"ID":"89f9297b-eee0-4e28-ba93-e13e8128be16","Type":"ContainerStarted","Data":"3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57"} Dec 09 04:58:48 crc kubenswrapper[4766]: I1209 04:58:48.737309 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.530050 4766 generic.go:334] "Generic (PLEG): container finished" podID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerID="3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57" exitCode=0 Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.530500 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw4rz" event={"ID":"89f9297b-eee0-4e28-ba93-e13e8128be16","Type":"ContainerDied","Data":"3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57"} Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.532513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"280582e1-1357-4079-bf39-4328d162cbe8","Type":"ContainerStarted","Data":"960368f0839d0c2c1d428d8f109a23ed9ce6a870bf9b8ab9c618bd829c2c3fd6"} Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.532699 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"280582e1-1357-4079-bf39-4328d162cbe8","Type":"ContainerStarted","Data":"26880766fc721dd280a2240c05cc7475b00242f53c6fcc9e3705dffd75d42c86"} Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.824569 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.824852 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="ceilometer-central-agent" containerID="cri-o://1ad853eb8fb6baba5cda29808a8c3dfb304d52c785a2f44cb621d82e3e219321" gracePeriod=30 Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.824939 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="sg-core" containerID="cri-o://8e31149e8a26e2858fd11e324f3e0cc84a94d578e538873745833f46d77ea690" gracePeriod=30 Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.825104 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="ceilometer-notification-agent" containerID="cri-o://829fdb86937093eba839915314f3d4df406acb97dcbd038b9a91a50d745d7e54" gracePeriod=30 Dec 09 04:58:49 crc kubenswrapper[4766]: I1209 04:58:49.825112 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="proxy-httpd" containerID="cri-o://ab32a41a7e73b6ea79dba1b62b47ea3cbf9102c74edb5891ba6ea56e7772af52" gracePeriod=30 Dec 09 04:58:50 crc kubenswrapper[4766]: I1209 04:58:50.544956 4766 generic.go:334] "Generic (PLEG): container finished" podID="cdf4e098-118e-4689-9009-118fc46215df" containerID="ab32a41a7e73b6ea79dba1b62b47ea3cbf9102c74edb5891ba6ea56e7772af52" exitCode=0 Dec 09 04:58:50 crc kubenswrapper[4766]: I1209 04:58:50.544981 4766 generic.go:334] "Generic (PLEG): container finished" podID="cdf4e098-118e-4689-9009-118fc46215df" containerID="8e31149e8a26e2858fd11e324f3e0cc84a94d578e538873745833f46d77ea690" exitCode=2 Dec 09 04:58:50 crc kubenswrapper[4766]: I1209 04:58:50.544991 4766 generic.go:334] "Generic (PLEG): container finished" podID="cdf4e098-118e-4689-9009-118fc46215df" containerID="1ad853eb8fb6baba5cda29808a8c3dfb304d52c785a2f44cb621d82e3e219321" exitCode=0 
Dec 09 04:58:50 crc kubenswrapper[4766]: I1209 04:58:50.545048 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerDied","Data":"ab32a41a7e73b6ea79dba1b62b47ea3cbf9102c74edb5891ba6ea56e7772af52"} Dec 09 04:58:50 crc kubenswrapper[4766]: I1209 04:58:50.545095 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerDied","Data":"8e31149e8a26e2858fd11e324f3e0cc84a94d578e538873745833f46d77ea690"} Dec 09 04:58:50 crc kubenswrapper[4766]: I1209 04:58:50.545107 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerDied","Data":"1ad853eb8fb6baba5cda29808a8c3dfb304d52c785a2f44cb621d82e3e219321"} Dec 09 04:58:50 crc kubenswrapper[4766]: I1209 04:58:50.547624 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw4rz" event={"ID":"89f9297b-eee0-4e28-ba93-e13e8128be16","Type":"ContainerStarted","Data":"dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d"} Dec 09 04:58:50 crc kubenswrapper[4766]: I1209 04:58:50.576014 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hw4rz" podStartSLOduration=3.132322938 podStartE2EDuration="5.575995578s" podCreationTimestamp="2025-12-09 04:58:45 +0000 UTC" firstStartedPulling="2025-12-09 04:58:47.50225132 +0000 UTC m=+6409.211556746" lastFinishedPulling="2025-12-09 04:58:49.94592397 +0000 UTC m=+6411.655229386" observedRunningTime="2025-12-09 04:58:50.570704715 +0000 UTC m=+6412.280010141" watchObservedRunningTime="2025-12-09 04:58:50.575995578 +0000 UTC m=+6412.285301004" Dec 09 04:58:51 crc kubenswrapper[4766]: I1209 04:58:51.588259 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"280582e1-1357-4079-bf39-4328d162cbe8","Type":"ContainerStarted","Data":"6e5f14f6c87b55002e4d95cbb36cdfe63b8e84b91169e4e7830b661d7c39a549"} Dec 09 04:58:52 crc kubenswrapper[4766]: I1209 04:58:52.600793 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"280582e1-1357-4079-bf39-4328d162cbe8","Type":"ContainerStarted","Data":"ee4b01c7809a971ebb2a10590139f6df8ba19798a0f7bdd1618107b9044bea96"} Dec 09 04:58:54 crc kubenswrapper[4766]: I1209 04:58:54.626714 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"280582e1-1357-4079-bf39-4328d162cbe8","Type":"ContainerStarted","Data":"c7df745f6d9b8b56ee7ddb19757079861741fcdbbc8e734d3d8fbaf11decfbe0"} Dec 09 04:58:54 crc kubenswrapper[4766]: I1209 04:58:54.631815 4766 generic.go:334] "Generic (PLEG): container finished" podID="cdf4e098-118e-4689-9009-118fc46215df" containerID="829fdb86937093eba839915314f3d4df406acb97dcbd038b9a91a50d745d7e54" exitCode=0 Dec 09 04:58:54 crc kubenswrapper[4766]: I1209 04:58:54.631896 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerDied","Data":"829fdb86937093eba839915314f3d4df406acb97dcbd038b9a91a50d745d7e54"} Dec 09 04:58:54 crc kubenswrapper[4766]: I1209 04:58:54.652944 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.989124318 podStartE2EDuration="7.652925804s" podCreationTimestamp="2025-12-09 04:58:47 +0000 UTC" firstStartedPulling="2025-12-09 04:58:48.749354994 +0000 UTC m=+6410.458660420" lastFinishedPulling="2025-12-09 04:58:53.41315649 +0000 UTC m=+6415.122461906" observedRunningTime="2025-12-09 04:58:54.650236071 +0000 UTC m=+6416.359541507" watchObservedRunningTime="2025-12-09 04:58:54.652925804 +0000 UTC m=+6416.362231230" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.405960 4766 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.472042 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-combined-ca-bundle\") pod \"cdf4e098-118e-4689-9009-118fc46215df\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.472135 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-scripts\") pod \"cdf4e098-118e-4689-9009-118fc46215df\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.472182 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-config-data\") pod \"cdf4e098-118e-4689-9009-118fc46215df\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.473819 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-log-httpd\") pod \"cdf4e098-118e-4689-9009-118fc46215df\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.473886 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-sg-core-conf-yaml\") pod \"cdf4e098-118e-4689-9009-118fc46215df\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.474005 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58f8q\" (UniqueName: 
\"kubernetes.io/projected/cdf4e098-118e-4689-9009-118fc46215df-kube-api-access-58f8q\") pod \"cdf4e098-118e-4689-9009-118fc46215df\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.474100 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-run-httpd\") pod \"cdf4e098-118e-4689-9009-118fc46215df\" (UID: \"cdf4e098-118e-4689-9009-118fc46215df\") " Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.475233 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cdf4e098-118e-4689-9009-118fc46215df" (UID: "cdf4e098-118e-4689-9009-118fc46215df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.476206 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cdf4e098-118e-4689-9009-118fc46215df" (UID: "cdf4e098-118e-4689-9009-118fc46215df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.483320 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-scripts" (OuterVolumeSpecName: "scripts") pod "cdf4e098-118e-4689-9009-118fc46215df" (UID: "cdf4e098-118e-4689-9009-118fc46215df"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.492887 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf4e098-118e-4689-9009-118fc46215df-kube-api-access-58f8q" (OuterVolumeSpecName: "kube-api-access-58f8q") pod "cdf4e098-118e-4689-9009-118fc46215df" (UID: "cdf4e098-118e-4689-9009-118fc46215df"). InnerVolumeSpecName "kube-api-access-58f8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.542452 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cdf4e098-118e-4689-9009-118fc46215df" (UID: "cdf4e098-118e-4689-9009-118fc46215df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.577349 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdf4e098-118e-4689-9009-118fc46215df" (UID: "cdf4e098-118e-4689-9009-118fc46215df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.577505 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58f8q\" (UniqueName: \"kubernetes.io/projected/cdf4e098-118e-4689-9009-118fc46215df-kube-api-access-58f8q\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.577704 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.577775 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.577838 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdf4e098-118e-4689-9009-118fc46215df-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.577899 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.649507 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-config-data" (OuterVolumeSpecName: "config-data") pod "cdf4e098-118e-4689-9009-118fc46215df" (UID: "cdf4e098-118e-4689-9009-118fc46215df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.653833 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdf4e098-118e-4689-9009-118fc46215df","Type":"ContainerDied","Data":"4fc18de7e1e5558a9f8240e048d7d9bbc6001024bb58fad596c30778962febf9"} Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.653890 4766 scope.go:117] "RemoveContainer" containerID="ab32a41a7e73b6ea79dba1b62b47ea3cbf9102c74edb5891ba6ea56e7772af52" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.654320 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.680442 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.680473 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf4e098-118e-4689-9009-118fc46215df-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.751324 4766 scope.go:117] "RemoveContainer" containerID="8e31149e8a26e2858fd11e324f3e0cc84a94d578e538873745833f46d77ea690" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.768916 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.776794 4766 scope.go:117] "RemoveContainer" containerID="829fdb86937093eba839915314f3d4df406acb97dcbd038b9a91a50d745d7e54" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.778756 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.787983 4766 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:55 crc kubenswrapper[4766]: E1209 04:58:55.788873 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="sg-core" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.788893 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="sg-core" Dec 09 04:58:55 crc kubenswrapper[4766]: E1209 04:58:55.789869 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="proxy-httpd" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.789885 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="proxy-httpd" Dec 09 04:58:55 crc kubenswrapper[4766]: E1209 04:58:55.789902 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="ceilometer-central-agent" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.789933 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="ceilometer-central-agent" Dec 09 04:58:55 crc kubenswrapper[4766]: E1209 04:58:55.789952 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="ceilometer-notification-agent" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.789959 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="ceilometer-notification-agent" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.790431 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="proxy-httpd" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.793896 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="sg-core" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.793919 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="ceilometer-central-agent" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.793932 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf4e098-118e-4689-9009-118fc46215df" containerName="ceilometer-notification-agent" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.795926 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.798084 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.800979 4766 scope.go:117] "RemoveContainer" containerID="1ad853eb8fb6baba5cda29808a8c3dfb304d52c785a2f44cb621d82e3e219321" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.801135 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.801319 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.874724 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.874776 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.883840 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-log-httpd\") pod \"ceilometer-0\" (UID: 
\"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.883898 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-config-data\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.883993 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kncfs\" (UniqueName: \"kubernetes.io/projected/fc6adaf4-b846-4113-a338-1bfeccca7a82-kube-api-access-kncfs\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.884010 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.884057 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-run-httpd\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.884083 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: 
I1209 04:58:55.884134 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-scripts\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.930096 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.984954 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-scripts\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.985045 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-log-httpd\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.985081 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-config-data\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.985139 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kncfs\" (UniqueName: \"kubernetes.io/projected/fc6adaf4-b846-4113-a338-1bfeccca7a82-kube-api-access-kncfs\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.985160 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.985207 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-run-httpd\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.985247 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.985818 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-log-httpd\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.986018 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-run-httpd\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.990371 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-scripts\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " 
pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.990582 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.991017 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:55 crc kubenswrapper[4766]: I1209 04:58:55.991424 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-config-data\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:56 crc kubenswrapper[4766]: I1209 04:58:56.014544 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kncfs\" (UniqueName: \"kubernetes.io/projected/fc6adaf4-b846-4113-a338-1bfeccca7a82-kube-api-access-kncfs\") pod \"ceilometer-0\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " pod="openstack/ceilometer-0" Dec 09 04:58:56 crc kubenswrapper[4766]: I1209 04:58:56.158750 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:58:56 crc kubenswrapper[4766]: I1209 04:58:56.633124 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:58:56 crc kubenswrapper[4766]: W1209 04:58:56.634152 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc6adaf4_b846_4113_a338_1bfeccca7a82.slice/crio-77944959b05a60557f366a1ac3908a8d526fe6d2aaa7d538f785cf8027d73597 WatchSource:0}: Error finding container 77944959b05a60557f366a1ac3908a8d526fe6d2aaa7d538f785cf8027d73597: Status 404 returned error can't find the container with id 77944959b05a60557f366a1ac3908a8d526fe6d2aaa7d538f785cf8027d73597 Dec 09 04:58:56 crc kubenswrapper[4766]: I1209 04:58:56.665351 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerStarted","Data":"77944959b05a60557f366a1ac3908a8d526fe6d2aaa7d538f785cf8027d73597"} Dec 09 04:58:56 crc kubenswrapper[4766]: I1209 04:58:56.725798 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:56 crc kubenswrapper[4766]: I1209 04:58:56.778096 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw4rz"] Dec 09 04:58:56 crc kubenswrapper[4766]: I1209 04:58:56.855044 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf4e098-118e-4689-9009-118fc46215df" path="/var/lib/kubelet/pods/cdf4e098-118e-4689-9009-118fc46215df/volumes" Dec 09 04:58:57 crc kubenswrapper[4766]: I1209 04:58:57.047278 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7c8md"] Dec 09 04:58:57 crc kubenswrapper[4766]: I1209 04:58:57.054485 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7c8md"] Dec 09 04:58:57 
crc kubenswrapper[4766]: I1209 04:58:57.687503 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerStarted","Data":"cc9850683755f976f0bcf6f338252ab996a8d9955376018a5b48a5de13a663e9"} Dec 09 04:58:57 crc kubenswrapper[4766]: I1209 04:58:57.839663 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:58:57 crc kubenswrapper[4766]: E1209 04:58:57.840240 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:58:58 crc kubenswrapper[4766]: I1209 04:58:58.698406 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerStarted","Data":"71507acc2593daebe497e2ad8fd906ccb2348c01a1c6389b199844058257f52b"} Dec 09 04:58:58 crc kubenswrapper[4766]: I1209 04:58:58.698783 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerStarted","Data":"323919ccc4b2929763ab6b4f34106ddfac5c810d5c1736fe900eb2476cedcd53"} Dec 09 04:58:58 crc kubenswrapper[4766]: I1209 04:58:58.698577 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hw4rz" podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerName="registry-server" containerID="cri-o://dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d" gracePeriod=2 Dec 09 04:58:58 crc kubenswrapper[4766]: I1209 04:58:58.869781 4766 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e53d15-c42f-4959-9ed5-ddff8241b3d0" path="/var/lib/kubelet/pods/b4e53d15-c42f-4959-9ed5-ddff8241b3d0/volumes" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.270980 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.388653 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8tv6\" (UniqueName: \"kubernetes.io/projected/89f9297b-eee0-4e28-ba93-e13e8128be16-kube-api-access-h8tv6\") pod \"89f9297b-eee0-4e28-ba93-e13e8128be16\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.389772 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-utilities\") pod \"89f9297b-eee0-4e28-ba93-e13e8128be16\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.389833 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-catalog-content\") pod \"89f9297b-eee0-4e28-ba93-e13e8128be16\" (UID: \"89f9297b-eee0-4e28-ba93-e13e8128be16\") " Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.390677 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-utilities" (OuterVolumeSpecName: "utilities") pod "89f9297b-eee0-4e28-ba93-e13e8128be16" (UID: "89f9297b-eee0-4e28-ba93-e13e8128be16"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.401398 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f9297b-eee0-4e28-ba93-e13e8128be16-kube-api-access-h8tv6" (OuterVolumeSpecName: "kube-api-access-h8tv6") pod "89f9297b-eee0-4e28-ba93-e13e8128be16" (UID: "89f9297b-eee0-4e28-ba93-e13e8128be16"). InnerVolumeSpecName "kube-api-access-h8tv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.403740 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89f9297b-eee0-4e28-ba93-e13e8128be16" (UID: "89f9297b-eee0-4e28-ba93-e13e8128be16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.404202 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.404246 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8tv6\" (UniqueName: \"kubernetes.io/projected/89f9297b-eee0-4e28-ba93-e13e8128be16-kube-api-access-h8tv6\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.404262 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f9297b-eee0-4e28-ba93-e13e8128be16-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.711878 4766 generic.go:334] "Generic (PLEG): container finished" podID="89f9297b-eee0-4e28-ba93-e13e8128be16" 
containerID="dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d" exitCode=0 Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.711934 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw4rz" event={"ID":"89f9297b-eee0-4e28-ba93-e13e8128be16","Type":"ContainerDied","Data":"dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d"} Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.711963 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw4rz" event={"ID":"89f9297b-eee0-4e28-ba93-e13e8128be16","Type":"ContainerDied","Data":"a5fbbc4119084fcad59177048fe3f6f2f651a233561eca4539f808a6aa4012b8"} Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.711986 4766 scope.go:117] "RemoveContainer" containerID="dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.712187 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw4rz" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.769963 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw4rz"] Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.770655 4766 scope.go:117] "RemoveContainer" containerID="3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.782428 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw4rz"] Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.834266 4766 scope.go:117] "RemoveContainer" containerID="cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.878176 4766 scope.go:117] "RemoveContainer" containerID="dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d" Dec 09 04:58:59 crc kubenswrapper[4766]: E1209 04:58:59.879883 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d\": container with ID starting with dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d not found: ID does not exist" containerID="dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.879945 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d"} err="failed to get container status \"dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d\": rpc error: code = NotFound desc = could not find container \"dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d\": container with ID starting with dc445464cb73a157f67ad8c482d9f6d737637828fca54e5eeb6d821ae25f044d not found: 
ID does not exist" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.879979 4766 scope.go:117] "RemoveContainer" containerID="3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57" Dec 09 04:58:59 crc kubenswrapper[4766]: E1209 04:58:59.883324 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57\": container with ID starting with 3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57 not found: ID does not exist" containerID="3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.883372 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57"} err="failed to get container status \"3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57\": rpc error: code = NotFound desc = could not find container \"3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57\": container with ID starting with 3932b474979508de9f4debad0691930a0c42bf4b02c6b9a94858879667340b57 not found: ID does not exist" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.883409 4766 scope.go:117] "RemoveContainer" containerID="cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5" Dec 09 04:58:59 crc kubenswrapper[4766]: E1209 04:58:59.884078 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5\": container with ID starting with cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5 not found: ID does not exist" containerID="cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5" Dec 09 04:58:59 crc kubenswrapper[4766]: I1209 04:58:59.884113 4766 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5"} err="failed to get container status \"cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5\": rpc error: code = NotFound desc = could not find container \"cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5\": container with ID starting with cc486cf214a6c4ccb070d7dd3a2faabad44213dfd7a3b47f862e3b2edde080b5 not found: ID does not exist" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.643281 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-r2zqs"] Dec 09 04:59:00 crc kubenswrapper[4766]: E1209 04:59:00.643976 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerName="registry-server" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.643992 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerName="registry-server" Dec 09 04:59:00 crc kubenswrapper[4766]: E1209 04:59:00.644003 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerName="extract-content" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.644009 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerName="extract-content" Dec 09 04:59:00 crc kubenswrapper[4766]: E1209 04:59:00.644023 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerName="extract-utilities" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.644029 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerName="extract-utilities" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.644274 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" containerName="registry-server" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.645020 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.682726 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-r2zqs"] Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.742391 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86805adc-8867-42c3-969a-9c23a0f59e17-operator-scripts\") pod \"manila-db-create-r2zqs\" (UID: \"86805adc-8867-42c3-969a-9c23a0f59e17\") " pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.742490 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnt2k\" (UniqueName: \"kubernetes.io/projected/86805adc-8867-42c3-969a-9c23a0f59e17-kube-api-access-hnt2k\") pod \"manila-db-create-r2zqs\" (UID: \"86805adc-8867-42c3-969a-9c23a0f59e17\") " pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.746067 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerStarted","Data":"a547c389b4a655fe19324728e7f9c22ad279cc356bb9d6be930557d75659f217"} Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.746279 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.772845 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.493543768 podStartE2EDuration="5.772828725s" podCreationTimestamp="2025-12-09 04:58:55 +0000 UTC" 
firstStartedPulling="2025-12-09 04:58:56.63677146 +0000 UTC m=+6418.346076876" lastFinishedPulling="2025-12-09 04:58:59.916056407 +0000 UTC m=+6421.625361833" observedRunningTime="2025-12-09 04:59:00.764774227 +0000 UTC m=+6422.474079653" watchObservedRunningTime="2025-12-09 04:59:00.772828725 +0000 UTC m=+6422.482134151" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.845683 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86805adc-8867-42c3-969a-9c23a0f59e17-operator-scripts\") pod \"manila-db-create-r2zqs\" (UID: \"86805adc-8867-42c3-969a-9c23a0f59e17\") " pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.845903 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnt2k\" (UniqueName: \"kubernetes.io/projected/86805adc-8867-42c3-969a-9c23a0f59e17-kube-api-access-hnt2k\") pod \"manila-db-create-r2zqs\" (UID: \"86805adc-8867-42c3-969a-9c23a0f59e17\") " pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.847851 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86805adc-8867-42c3-969a-9c23a0f59e17-operator-scripts\") pod \"manila-db-create-r2zqs\" (UID: \"86805adc-8867-42c3-969a-9c23a0f59e17\") " pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.856527 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f9297b-eee0-4e28-ba93-e13e8128be16" path="/var/lib/kubelet/pods/89f9297b-eee0-4e28-ba93-e13e8128be16/volumes" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.858536 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-58d2-account-create-update-hwmhp"] Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.860163 4766 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.863174 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.870205 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnt2k\" (UniqueName: \"kubernetes.io/projected/86805adc-8867-42c3-969a-9c23a0f59e17-kube-api-access-hnt2k\") pod \"manila-db-create-r2zqs\" (UID: \"86805adc-8867-42c3-969a-9c23a0f59e17\") " pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.873041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-58d2-account-create-update-hwmhp"] Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.947993 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hdjm\" (UniqueName: \"kubernetes.io/projected/a0e7d30c-dd55-4878-bfd6-9916479661c4-kube-api-access-4hdjm\") pod \"manila-58d2-account-create-update-hwmhp\" (UID: \"a0e7d30c-dd55-4878-bfd6-9916479661c4\") " pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:00 crc kubenswrapper[4766]: I1209 04:59:00.948420 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e7d30c-dd55-4878-bfd6-9916479661c4-operator-scripts\") pod \"manila-58d2-account-create-update-hwmhp\" (UID: \"a0e7d30c-dd55-4878-bfd6-9916479661c4\") " pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.005735 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.049998 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e7d30c-dd55-4878-bfd6-9916479661c4-operator-scripts\") pod \"manila-58d2-account-create-update-hwmhp\" (UID: \"a0e7d30c-dd55-4878-bfd6-9916479661c4\") " pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.050850 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e7d30c-dd55-4878-bfd6-9916479661c4-operator-scripts\") pod \"manila-58d2-account-create-update-hwmhp\" (UID: \"a0e7d30c-dd55-4878-bfd6-9916479661c4\") " pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.051098 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hdjm\" (UniqueName: \"kubernetes.io/projected/a0e7d30c-dd55-4878-bfd6-9916479661c4-kube-api-access-4hdjm\") pod \"manila-58d2-account-create-update-hwmhp\" (UID: \"a0e7d30c-dd55-4878-bfd6-9916479661c4\") " pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.072884 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hdjm\" (UniqueName: \"kubernetes.io/projected/a0e7d30c-dd55-4878-bfd6-9916479661c4-kube-api-access-4hdjm\") pod \"manila-58d2-account-create-update-hwmhp\" (UID: \"a0e7d30c-dd55-4878-bfd6-9916479661c4\") " pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.248977 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.578702 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-r2zqs"] Dec 09 04:59:01 crc kubenswrapper[4766]: W1209 04:59:01.585885 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86805adc_8867_42c3_969a_9c23a0f59e17.slice/crio-ff2ec1c9e049c8765c17227098f796030b2c0655050b751aa2c92d2592da1e10 WatchSource:0}: Error finding container ff2ec1c9e049c8765c17227098f796030b2c0655050b751aa2c92d2592da1e10: Status 404 returned error can't find the container with id ff2ec1c9e049c8765c17227098f796030b2c0655050b751aa2c92d2592da1e10 Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.765432 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-r2zqs" event={"ID":"86805adc-8867-42c3-969a-9c23a0f59e17","Type":"ContainerStarted","Data":"ff2ec1c9e049c8765c17227098f796030b2c0655050b751aa2c92d2592da1e10"} Dec 09 04:59:01 crc kubenswrapper[4766]: I1209 04:59:01.792467 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-58d2-account-create-update-hwmhp"] Dec 09 04:59:01 crc kubenswrapper[4766]: W1209 04:59:01.801662 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0e7d30c_dd55_4878_bfd6_9916479661c4.slice/crio-7bedf07792e237fd06759442bb70be2a17291798840fd908375b5eb2813e55e0 WatchSource:0}: Error finding container 7bedf07792e237fd06759442bb70be2a17291798840fd908375b5eb2813e55e0: Status 404 returned error can't find the container with id 7bedf07792e237fd06759442bb70be2a17291798840fd908375b5eb2813e55e0 Dec 09 04:59:02 crc kubenswrapper[4766]: I1209 04:59:02.779755 4766 generic.go:334] "Generic (PLEG): container finished" podID="86805adc-8867-42c3-969a-9c23a0f59e17" 
containerID="b281ecd66eeb9d569a8f2727632b756707525dd6b269a53cef9bcf46e62ec12b" exitCode=0 Dec 09 04:59:02 crc kubenswrapper[4766]: I1209 04:59:02.779856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-r2zqs" event={"ID":"86805adc-8867-42c3-969a-9c23a0f59e17","Type":"ContainerDied","Data":"b281ecd66eeb9d569a8f2727632b756707525dd6b269a53cef9bcf46e62ec12b"} Dec 09 04:59:02 crc kubenswrapper[4766]: I1209 04:59:02.782554 4766 generic.go:334] "Generic (PLEG): container finished" podID="a0e7d30c-dd55-4878-bfd6-9916479661c4" containerID="cef5bb8de4608acd6783f4e83886ca0c6b7a4a201e32898a826f5cf76916c954" exitCode=0 Dec 09 04:59:02 crc kubenswrapper[4766]: I1209 04:59:02.782591 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-58d2-account-create-update-hwmhp" event={"ID":"a0e7d30c-dd55-4878-bfd6-9916479661c4","Type":"ContainerDied","Data":"cef5bb8de4608acd6783f4e83886ca0c6b7a4a201e32898a826f5cf76916c954"} Dec 09 04:59:02 crc kubenswrapper[4766]: I1209 04:59:02.782608 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-58d2-account-create-update-hwmhp" event={"ID":"a0e7d30c-dd55-4878-bfd6-9916479661c4","Type":"ContainerStarted","Data":"7bedf07792e237fd06759442bb70be2a17291798840fd908375b5eb2813e55e0"} Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.483062 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.491920 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.537854 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e7d30c-dd55-4878-bfd6-9916479661c4-operator-scripts\") pod \"a0e7d30c-dd55-4878-bfd6-9916479661c4\" (UID: \"a0e7d30c-dd55-4878-bfd6-9916479661c4\") " Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.538118 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86805adc-8867-42c3-969a-9c23a0f59e17-operator-scripts\") pod \"86805adc-8867-42c3-969a-9c23a0f59e17\" (UID: \"86805adc-8867-42c3-969a-9c23a0f59e17\") " Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.538233 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hdjm\" (UniqueName: \"kubernetes.io/projected/a0e7d30c-dd55-4878-bfd6-9916479661c4-kube-api-access-4hdjm\") pod \"a0e7d30c-dd55-4878-bfd6-9916479661c4\" (UID: \"a0e7d30c-dd55-4878-bfd6-9916479661c4\") " Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.538343 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnt2k\" (UniqueName: \"kubernetes.io/projected/86805adc-8867-42c3-969a-9c23a0f59e17-kube-api-access-hnt2k\") pod \"86805adc-8867-42c3-969a-9c23a0f59e17\" (UID: \"86805adc-8867-42c3-969a-9c23a0f59e17\") " Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.538485 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e7d30c-dd55-4878-bfd6-9916479661c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0e7d30c-dd55-4878-bfd6-9916479661c4" (UID: "a0e7d30c-dd55-4878-bfd6-9916479661c4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.539152 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e7d30c-dd55-4878-bfd6-9916479661c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.540029 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86805adc-8867-42c3-969a-9c23a0f59e17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86805adc-8867-42c3-969a-9c23a0f59e17" (UID: "86805adc-8867-42c3-969a-9c23a0f59e17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.549288 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86805adc-8867-42c3-969a-9c23a0f59e17-kube-api-access-hnt2k" (OuterVolumeSpecName: "kube-api-access-hnt2k") pod "86805adc-8867-42c3-969a-9c23a0f59e17" (UID: "86805adc-8867-42c3-969a-9c23a0f59e17"). InnerVolumeSpecName "kube-api-access-hnt2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.549539 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e7d30c-dd55-4878-bfd6-9916479661c4-kube-api-access-4hdjm" (OuterVolumeSpecName: "kube-api-access-4hdjm") pod "a0e7d30c-dd55-4878-bfd6-9916479661c4" (UID: "a0e7d30c-dd55-4878-bfd6-9916479661c4"). InnerVolumeSpecName "kube-api-access-4hdjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.642013 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86805adc-8867-42c3-969a-9c23a0f59e17-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.642053 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hdjm\" (UniqueName: \"kubernetes.io/projected/a0e7d30c-dd55-4878-bfd6-9916479661c4-kube-api-access-4hdjm\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.642068 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnt2k\" (UniqueName: \"kubernetes.io/projected/86805adc-8867-42c3-969a-9c23a0f59e17-kube-api-access-hnt2k\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.813049 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-58d2-account-create-update-hwmhp" event={"ID":"a0e7d30c-dd55-4878-bfd6-9916479661c4","Type":"ContainerDied","Data":"7bedf07792e237fd06759442bb70be2a17291798840fd908375b5eb2813e55e0"} Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.813222 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bedf07792e237fd06759442bb70be2a17291798840fd908375b5eb2813e55e0" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.815313 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-58d2-account-create-update-hwmhp" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.822008 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-r2zqs" event={"ID":"86805adc-8867-42c3-969a-9c23a0f59e17","Type":"ContainerDied","Data":"ff2ec1c9e049c8765c17227098f796030b2c0655050b751aa2c92d2592da1e10"} Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.822048 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff2ec1c9e049c8765c17227098f796030b2c0655050b751aa2c92d2592da1e10" Dec 09 04:59:04 crc kubenswrapper[4766]: I1209 04:59:04.822143 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-r2zqs" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.151448 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-hz65c"] Dec 09 04:59:06 crc kubenswrapper[4766]: E1209 04:59:06.152861 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e7d30c-dd55-4878-bfd6-9916479661c4" containerName="mariadb-account-create-update" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.152899 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e7d30c-dd55-4878-bfd6-9916479661c4" containerName="mariadb-account-create-update" Dec 09 04:59:06 crc kubenswrapper[4766]: E1209 04:59:06.152942 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86805adc-8867-42c3-969a-9c23a0f59e17" containerName="mariadb-database-create" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.152960 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="86805adc-8867-42c3-969a-9c23a0f59e17" containerName="mariadb-database-create" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.153366 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e7d30c-dd55-4878-bfd6-9916479661c4" containerName="mariadb-account-create-update" 
Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.153398 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="86805adc-8867-42c3-969a-9c23a0f59e17" containerName="mariadb-database-create" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.154320 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.158258 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-vt5kq" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.158696 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.174716 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-hz65c"] Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.179150 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-job-config-data\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.179289 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99nf5\" (UniqueName: \"kubernetes.io/projected/501e3fae-1369-4582-b9a5-af19e30ad540-kube-api-access-99nf5\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.179366 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-config-data\") pod \"manila-db-sync-hz65c\" (UID: 
\"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.179414 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-combined-ca-bundle\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.281262 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99nf5\" (UniqueName: \"kubernetes.io/projected/501e3fae-1369-4582-b9a5-af19e30ad540-kube-api-access-99nf5\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.281702 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-config-data\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.281796 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-combined-ca-bundle\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.281927 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-job-config-data\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc 
kubenswrapper[4766]: I1209 04:59:06.288642 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-job-config-data\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.289164 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-config-data\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.300020 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-combined-ca-bundle\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.309896 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99nf5\" (UniqueName: \"kubernetes.io/projected/501e3fae-1369-4582-b9a5-af19e30ad540-kube-api-access-99nf5\") pod \"manila-db-sync-hz65c\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:06 crc kubenswrapper[4766]: I1209 04:59:06.499384 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:07 crc kubenswrapper[4766]: I1209 04:59:07.259884 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-hz65c"] Dec 09 04:59:07 crc kubenswrapper[4766]: W1209 04:59:07.278969 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod501e3fae_1369_4582_b9a5_af19e30ad540.slice/crio-c5a98db1d42ea06b2eaf250add6266a00d96457aed520a40c0d9365edede1d1c WatchSource:0}: Error finding container c5a98db1d42ea06b2eaf250add6266a00d96457aed520a40c0d9365edede1d1c: Status 404 returned error can't find the container with id c5a98db1d42ea06b2eaf250add6266a00d96457aed520a40c0d9365edede1d1c Dec 09 04:59:07 crc kubenswrapper[4766]: I1209 04:59:07.857250 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-hz65c" event={"ID":"501e3fae-1369-4582-b9a5-af19e30ad540","Type":"ContainerStarted","Data":"c5a98db1d42ea06b2eaf250add6266a00d96457aed520a40c0d9365edede1d1c"} Dec 09 04:59:09 crc kubenswrapper[4766]: I1209 04:59:09.840195 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:59:09 crc kubenswrapper[4766]: E1209 04:59:09.840815 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:59:12 crc kubenswrapper[4766]: I1209 04:59:12.922303 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-hz65c" 
event={"ID":"501e3fae-1369-4582-b9a5-af19e30ad540","Type":"ContainerStarted","Data":"b8190f6d1d9dc2c569155f804e6bf83370fda4966188c67cfc7705adbe223fa2"} Dec 09 04:59:12 crc kubenswrapper[4766]: I1209 04:59:12.947814 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-hz65c" podStartSLOduration=2.871023012 podStartE2EDuration="6.947790364s" podCreationTimestamp="2025-12-09 04:59:06 +0000 UTC" firstStartedPulling="2025-12-09 04:59:07.282029293 +0000 UTC m=+6428.991334719" lastFinishedPulling="2025-12-09 04:59:11.358796645 +0000 UTC m=+6433.068102071" observedRunningTime="2025-12-09 04:59:12.944281749 +0000 UTC m=+6434.653587245" watchObservedRunningTime="2025-12-09 04:59:12.947790364 +0000 UTC m=+6434.657095830" Dec 09 04:59:13 crc kubenswrapper[4766]: I1209 04:59:13.937521 4766 generic.go:334] "Generic (PLEG): container finished" podID="501e3fae-1369-4582-b9a5-af19e30ad540" containerID="b8190f6d1d9dc2c569155f804e6bf83370fda4966188c67cfc7705adbe223fa2" exitCode=0 Dec 09 04:59:13 crc kubenswrapper[4766]: I1209 04:59:13.937701 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-hz65c" event={"ID":"501e3fae-1369-4582-b9a5-af19e30ad540","Type":"ContainerDied","Data":"b8190f6d1d9dc2c569155f804e6bf83370fda4966188c67cfc7705adbe223fa2"} Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.455442 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.604286 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-combined-ca-bundle\") pod \"501e3fae-1369-4582-b9a5-af19e30ad540\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.604486 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-job-config-data\") pod \"501e3fae-1369-4582-b9a5-af19e30ad540\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.604513 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-config-data\") pod \"501e3fae-1369-4582-b9a5-af19e30ad540\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.604599 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99nf5\" (UniqueName: \"kubernetes.io/projected/501e3fae-1369-4582-b9a5-af19e30ad540-kube-api-access-99nf5\") pod \"501e3fae-1369-4582-b9a5-af19e30ad540\" (UID: \"501e3fae-1369-4582-b9a5-af19e30ad540\") " Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.609611 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "501e3fae-1369-4582-b9a5-af19e30ad540" (UID: "501e3fae-1369-4582-b9a5-af19e30ad540"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.609824 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501e3fae-1369-4582-b9a5-af19e30ad540-kube-api-access-99nf5" (OuterVolumeSpecName: "kube-api-access-99nf5") pod "501e3fae-1369-4582-b9a5-af19e30ad540" (UID: "501e3fae-1369-4582-b9a5-af19e30ad540"). InnerVolumeSpecName "kube-api-access-99nf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.612643 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-config-data" (OuterVolumeSpecName: "config-data") pod "501e3fae-1369-4582-b9a5-af19e30ad540" (UID: "501e3fae-1369-4582-b9a5-af19e30ad540"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.635608 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "501e3fae-1369-4582-b9a5-af19e30ad540" (UID: "501e3fae-1369-4582-b9a5-af19e30ad540"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.707025 4766 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.707062 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.707078 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99nf5\" (UniqueName: \"kubernetes.io/projected/501e3fae-1369-4582-b9a5-af19e30ad540-kube-api-access-99nf5\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.707091 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501e3fae-1369-4582-b9a5-af19e30ad540-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.956584 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-hz65c" event={"ID":"501e3fae-1369-4582-b9a5-af19e30ad540","Type":"ContainerDied","Data":"c5a98db1d42ea06b2eaf250add6266a00d96457aed520a40c0d9365edede1d1c"} Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.956628 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a98db1d42ea06b2eaf250add6266a00d96457aed520a40c0d9365edede1d1c" Dec 09 04:59:15 crc kubenswrapper[4766]: I1209 04:59:15.956696 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-hz65c" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.421852 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 09 04:59:16 crc kubenswrapper[4766]: E1209 04:59:16.422589 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501e3fae-1369-4582-b9a5-af19e30ad540" containerName="manila-db-sync" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.422608 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="501e3fae-1369-4582-b9a5-af19e30ad540" containerName="manila-db-sync" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.422855 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="501e3fae-1369-4582-b9a5-af19e30ad540" containerName="manila-db-sync" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.427314 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.430543 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.436896 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.437240 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-vt5kq" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.437500 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.449231 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.485355 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 09 04:59:16 crc 
kubenswrapper[4766]: I1209 04:59:16.488265 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.492990 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.525386 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.585145 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-574fd486b5-dr8gv"] Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.587379 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.599620 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-574fd486b5-dr8gv"] Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626601 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27pt\" (UniqueName: \"kubernetes.io/projected/6458ca19-4210-498d-8022-305da98f2544-kube-api-access-k27pt\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626650 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626675 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4p5b\" (UniqueName: 
\"kubernetes.io/projected/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-kube-api-access-m4p5b\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626697 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-config-data\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626716 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-scripts\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626738 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-config-data\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626778 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6458ca19-4210-498d-8022-305da98f2544-ceph\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626796 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-config-data-custom\") pod 
\"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626828 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6458ca19-4210-498d-8022-305da98f2544-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626849 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-scripts\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626934 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.626966 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.627003 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6458ca19-4210-498d-8022-305da98f2544-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.627028 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.663806 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.669494 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.673283 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.678942 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.728223 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-dns-svc\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.729817 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.729870 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.729922 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6458ca19-4210-498d-8022-305da98f2544-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.729952 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.729981 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-sb\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730014 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k27pt\" (UniqueName: \"kubernetes.io/projected/6458ca19-4210-498d-8022-305da98f2544-kube-api-access-k27pt\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730031 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730053 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4p5b\" (UniqueName: \"kubernetes.io/projected/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-kube-api-access-m4p5b\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730075 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-config\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730099 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-config-data\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730116 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-scripts\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730137 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-config-data\") pod \"manila-share-share1-0\" (UID: 
\"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730180 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4sn8\" (UniqueName: \"kubernetes.io/projected/5a45af36-95fe-48bb-9711-fbe9f418eaa4-kube-api-access-r4sn8\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730205 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6458ca19-4210-498d-8022-305da98f2544-ceph\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730245 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730285 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6458ca19-4210-498d-8022-305da98f2544-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730301 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-scripts\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc 
kubenswrapper[4766]: I1209 04:59:16.730338 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-nb\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.730661 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.733342 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6458ca19-4210-498d-8022-305da98f2544-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.733549 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6458ca19-4210-498d-8022-305da98f2544-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.735675 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.738860 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-scripts\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.739981 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-config-data\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.740838 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.741817 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.742599 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6458ca19-4210-498d-8022-305da98f2544-ceph\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.742754 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-scripts\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 
04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.750826 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27pt\" (UniqueName: \"kubernetes.io/projected/6458ca19-4210-498d-8022-305da98f2544-kube-api-access-k27pt\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.758419 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6458ca19-4210-498d-8022-305da98f2544-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6458ca19-4210-498d-8022-305da98f2544\") " pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.761824 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4p5b\" (UniqueName: \"kubernetes.io/projected/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-kube-api-access-m4p5b\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.762095 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a74bac3-3464-4c09-bec4-bd0d2e0e88f9-config-data\") pod \"manila-scheduler-0\" (UID: \"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9\") " pod="openstack/manila-scheduler-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.823010 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831437 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4sn8\" (UniqueName: \"kubernetes.io/projected/5a45af36-95fe-48bb-9711-fbe9f418eaa4-kube-api-access-r4sn8\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831480 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831505 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-scripts\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831556 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-nb\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831583 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-config-data-custom\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc 
kubenswrapper[4766]: I1209 04:59:16.831599 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-config-data\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831626 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b487089-eff1-45b3-b892-26574d04acfe-etc-machine-id\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831663 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-dns-svc\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831682 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b487089-eff1-45b3-b892-26574d04acfe-logs\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831751 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42q4t\" (UniqueName: \"kubernetes.io/projected/8b487089-eff1-45b3-b892-26574d04acfe-kube-api-access-42q4t\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831774 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-sb\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.831806 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-config\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.832663 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-config\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.832747 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-dns-svc\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.833160 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-sb\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.834774 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-nb\") pod 
\"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.878017 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4sn8\" (UniqueName: \"kubernetes.io/projected/5a45af36-95fe-48bb-9711-fbe9f418eaa4-kube-api-access-r4sn8\") pod \"dnsmasq-dns-574fd486b5-dr8gv\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.907705 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.932852 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b487089-eff1-45b3-b892-26574d04acfe-etc-machine-id\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.932921 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b487089-eff1-45b3-b892-26574d04acfe-logs\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.932996 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42q4t\" (UniqueName: \"kubernetes.io/projected/8b487089-eff1-45b3-b892-26574d04acfe-kube-api-access-42q4t\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.933057 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.933076 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-scripts\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.933125 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-config-data-custom\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.933141 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-config-data\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.933855 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b487089-eff1-45b3-b892-26574d04acfe-etc-machine-id\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.934168 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b487089-eff1-45b3-b892-26574d04acfe-logs\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.937720 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-config-data\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.938681 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-scripts\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.941988 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-config-data-custom\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.942393 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b487089-eff1-45b3-b892-26574d04acfe-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.958878 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42q4t\" (UniqueName: \"kubernetes.io/projected/8b487089-eff1-45b3-b892-26574d04acfe-kube-api-access-42q4t\") pod \"manila-api-0\" (UID: \"8b487089-eff1-45b3-b892-26574d04acfe\") " pod="openstack/manila-api-0" Dec 09 04:59:16 crc kubenswrapper[4766]: I1209 04:59:16.997499 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.047855 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 09 04:59:17 crc kubenswrapper[4766]: W1209 04:59:17.488355 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6458ca19_4210_498d_8022_305da98f2544.slice/crio-40d2d84086778ad81e5c1a9b2870e0a79f70d3033535ad238ee932e71cf7aa4d WatchSource:0}: Error finding container 40d2d84086778ad81e5c1a9b2870e0a79f70d3033535ad238ee932e71cf7aa4d: Status 404 returned error can't find the container with id 40d2d84086778ad81e5c1a9b2870e0a79f70d3033535ad238ee932e71cf7aa4d Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.492001 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.515257 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-574fd486b5-dr8gv"] Dec 09 04:59:17 crc kubenswrapper[4766]: W1209 04:59:17.517597 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a45af36_95fe_48bb_9711_fbe9f418eaa4.slice/crio-33f533f2ebe2eb1c391fc368c3b07b6f3d1953a621cadf36c52dae915cfa1ada WatchSource:0}: Error finding container 33f533f2ebe2eb1c391fc368c3b07b6f3d1953a621cadf36c52dae915cfa1ada: Status 404 returned error can't find the container with id 33f533f2ebe2eb1c391fc368c3b07b6f3d1953a621cadf36c52dae915cfa1ada Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.751831 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 09 04:59:17 crc kubenswrapper[4766]: W1209 04:59:17.814648 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a74bac3_3464_4c09_bec4_bd0d2e0e88f9.slice/crio-44a522683e99b7dbb922df47d8a39f104d040c92faa04235631a81845b2000f9 WatchSource:0}: Error finding container 
44a522683e99b7dbb922df47d8a39f104d040c92faa04235631a81845b2000f9: Status 404 returned error can't find the container with id 44a522683e99b7dbb922df47d8a39f104d040c92faa04235631a81845b2000f9 Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.939632 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 09 04:59:17 crc kubenswrapper[4766]: W1209 04:59:17.959173 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b487089_eff1_45b3_b892_26574d04acfe.slice/crio-a27db96a576eff619164024de8ab9471e5683dc0b412e6c8877844983df85c0c WatchSource:0}: Error finding container a27db96a576eff619164024de8ab9471e5683dc0b412e6c8877844983df85c0c: Status 404 returned error can't find the container with id a27db96a576eff619164024de8ab9471e5683dc0b412e6c8877844983df85c0c Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.984612 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9","Type":"ContainerStarted","Data":"44a522683e99b7dbb922df47d8a39f104d040c92faa04235631a81845b2000f9"} Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.985545 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8b487089-eff1-45b3-b892-26574d04acfe","Type":"ContainerStarted","Data":"a27db96a576eff619164024de8ab9471e5683dc0b412e6c8877844983df85c0c"} Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.986292 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6458ca19-4210-498d-8022-305da98f2544","Type":"ContainerStarted","Data":"40d2d84086778ad81e5c1a9b2870e0a79f70d3033535ad238ee932e71cf7aa4d"} Dec 09 04:59:17 crc kubenswrapper[4766]: I1209 04:59:17.987776 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" 
event={"ID":"5a45af36-95fe-48bb-9711-fbe9f418eaa4","Type":"ContainerStarted","Data":"33f533f2ebe2eb1c391fc368c3b07b6f3d1953a621cadf36c52dae915cfa1ada"} Dec 09 04:59:19 crc kubenswrapper[4766]: I1209 04:59:19.036303 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8b487089-eff1-45b3-b892-26574d04acfe","Type":"ContainerStarted","Data":"3f96862cafa1a8f9661547eb1ef5f38e6b12668a3aa73b864b3b6f1e44a5b938"} Dec 09 04:59:19 crc kubenswrapper[4766]: I1209 04:59:19.036879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"8b487089-eff1-45b3-b892-26574d04acfe","Type":"ContainerStarted","Data":"19be5d74c1ba014312ae352133fe0723b5f3bb9438b9ba8b31db8e65cb0c3db0"} Dec 09 04:59:19 crc kubenswrapper[4766]: I1209 04:59:19.036960 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 09 04:59:19 crc kubenswrapper[4766]: I1209 04:59:19.042842 4766 generic.go:334] "Generic (PLEG): container finished" podID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" containerID="78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f" exitCode=0 Dec 09 04:59:19 crc kubenswrapper[4766]: I1209 04:59:19.042896 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" event={"ID":"5a45af36-95fe-48bb-9711-fbe9f418eaa4","Type":"ContainerDied","Data":"78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f"} Dec 09 04:59:19 crc kubenswrapper[4766]: I1209 04:59:19.065477 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9","Type":"ContainerStarted","Data":"1577ff58a198b8eb0893f246376ecce28155a544d1ed41adea9a7c83bbcca2b5"} Dec 09 04:59:19 crc kubenswrapper[4766]: I1209 04:59:19.082987 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.082963557 
podStartE2EDuration="3.082963557s" podCreationTimestamp="2025-12-09 04:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:59:19.066248535 +0000 UTC m=+6440.775553961" watchObservedRunningTime="2025-12-09 04:59:19.082963557 +0000 UTC m=+6440.792268983" Dec 09 04:59:20 crc kubenswrapper[4766]: I1209 04:59:20.078657 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" event={"ID":"5a45af36-95fe-48bb-9711-fbe9f418eaa4","Type":"ContainerStarted","Data":"eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2"} Dec 09 04:59:20 crc kubenswrapper[4766]: I1209 04:59:20.079121 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:20 crc kubenswrapper[4766]: I1209 04:59:20.085522 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8a74bac3-3464-4c09-bec4-bd0d2e0e88f9","Type":"ContainerStarted","Data":"daab7a427e52ba572e4cb31733b4aae69c06f6e5b5e8dd68f8fe37a90fe912ab"} Dec 09 04:59:20 crc kubenswrapper[4766]: I1209 04:59:20.110256 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" podStartSLOduration=4.110233026 podStartE2EDuration="4.110233026s" podCreationTimestamp="2025-12-09 04:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 04:59:20.097443449 +0000 UTC m=+6441.806748875" watchObservedRunningTime="2025-12-09 04:59:20.110233026 +0000 UTC m=+6441.819538452" Dec 09 04:59:20 crc kubenswrapper[4766]: I1209 04:59:20.123836 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.494587147 podStartE2EDuration="4.123814912s" podCreationTimestamp="2025-12-09 
04:59:16 +0000 UTC" firstStartedPulling="2025-12-09 04:59:17.832804711 +0000 UTC m=+6439.542110137" lastFinishedPulling="2025-12-09 04:59:18.462032476 +0000 UTC m=+6440.171337902" observedRunningTime="2025-12-09 04:59:20.115109397 +0000 UTC m=+6441.824414833" watchObservedRunningTime="2025-12-09 04:59:20.123814912 +0000 UTC m=+6441.833120358" Dec 09 04:59:20 crc kubenswrapper[4766]: I1209 04:59:20.839003 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:59:20 crc kubenswrapper[4766]: E1209 04:59:20.839230 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:59:23 crc kubenswrapper[4766]: I1209 04:59:23.517390 4766 scope.go:117] "RemoveContainer" containerID="c69943f6e8283184ae7adbf4306bffabe68397b01a656d285816eca3b5b91910" Dec 09 04:59:23 crc kubenswrapper[4766]: I1209 04:59:23.940188 4766 scope.go:117] "RemoveContainer" containerID="23bc3bba525e3a270ad6e8fcfb6b09a289c9151c273df598f32432e5427f8ff9" Dec 09 04:59:24 crc kubenswrapper[4766]: I1209 04:59:24.007799 4766 scope.go:117] "RemoveContainer" containerID="3ff4eb02b86ecea7771b14f831fe75e24e15cf835687f22ddca2f7eb7fc1606f" Dec 09 04:59:24 crc kubenswrapper[4766]: I1209 04:59:24.207275 4766 scope.go:117] "RemoveContainer" containerID="e5441e9b89d5e07ae1c7cf467200cbf853701173ca108861f591f5d700a5c499" Dec 09 04:59:24 crc kubenswrapper[4766]: I1209 04:59:24.249314 4766 scope.go:117] "RemoveContainer" containerID="7a606ef6a26c059612d4536fbcc43763558f4d1f249a3613caa872f83e394901" Dec 09 04:59:25 crc kubenswrapper[4766]: I1209 04:59:25.162312 4766 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6458ca19-4210-498d-8022-305da98f2544","Type":"ContainerStarted","Data":"f1c596b3cbbf4c704ad7c396faf8baf521fc11d609e6e254a126d917d08ddff0"} Dec 09 04:59:25 crc kubenswrapper[4766]: I1209 04:59:25.162600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6458ca19-4210-498d-8022-305da98f2544","Type":"ContainerStarted","Data":"b9ef98b570a9172983ff9f6c908e27cbf1da56804766bb58fc97b02a80163ae5"} Dec 09 04:59:25 crc kubenswrapper[4766]: I1209 04:59:25.197801 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.677867603 podStartE2EDuration="9.197783109s" podCreationTimestamp="2025-12-09 04:59:16 +0000 UTC" firstStartedPulling="2025-12-09 04:59:17.490725621 +0000 UTC m=+6439.200031047" lastFinishedPulling="2025-12-09 04:59:24.010641127 +0000 UTC m=+6445.719946553" observedRunningTime="2025-12-09 04:59:25.180133601 +0000 UTC m=+6446.889439067" watchObservedRunningTime="2025-12-09 04:59:25.197783109 +0000 UTC m=+6446.907088545" Dec 09 04:59:26 crc kubenswrapper[4766]: I1209 04:59:26.208871 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 04:59:26 crc kubenswrapper[4766]: I1209 04:59:26.824424 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 09 04:59:26 crc kubenswrapper[4766]: I1209 04:59:26.909164 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 04:59:26 crc kubenswrapper[4766]: I1209 04:59:26.989749 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b67fc789-stj2f"] Dec 09 04:59:26 crc kubenswrapper[4766]: I1209 04:59:26.990030 4766 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-57b67fc789-stj2f" podUID="c7dcac32-7e95-4025-8aee-a985b13b52ef" containerName="dnsmasq-dns" containerID="cri-o://6fc6b3ef12267c15d96698b8b2c4282895962bbd8a89ebd7e75b696ada87f8a8" gracePeriod=10 Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.049418 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.208425 4766 generic.go:334] "Generic (PLEG): container finished" podID="c7dcac32-7e95-4025-8aee-a985b13b52ef" containerID="6fc6b3ef12267c15d96698b8b2c4282895962bbd8a89ebd7e75b696ada87f8a8" exitCode=0 Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.208491 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" event={"ID":"c7dcac32-7e95-4025-8aee-a985b13b52ef","Type":"ContainerDied","Data":"6fc6b3ef12267c15d96698b8b2c4282895962bbd8a89ebd7e75b696ada87f8a8"} Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.596132 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.683951 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blslr\" (UniqueName: \"kubernetes.io/projected/c7dcac32-7e95-4025-8aee-a985b13b52ef-kube-api-access-blslr\") pod \"c7dcac32-7e95-4025-8aee-a985b13b52ef\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.684020 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-sb\") pod \"c7dcac32-7e95-4025-8aee-a985b13b52ef\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.684164 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-config\") pod \"c7dcac32-7e95-4025-8aee-a985b13b52ef\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.684185 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-dns-svc\") pod \"c7dcac32-7e95-4025-8aee-a985b13b52ef\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.684202 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-nb\") pod \"c7dcac32-7e95-4025-8aee-a985b13b52ef\" (UID: \"c7dcac32-7e95-4025-8aee-a985b13b52ef\") " Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.690869 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c7dcac32-7e95-4025-8aee-a985b13b52ef-kube-api-access-blslr" (OuterVolumeSpecName: "kube-api-access-blslr") pod "c7dcac32-7e95-4025-8aee-a985b13b52ef" (UID: "c7dcac32-7e95-4025-8aee-a985b13b52ef"). InnerVolumeSpecName "kube-api-access-blslr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.741350 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7dcac32-7e95-4025-8aee-a985b13b52ef" (UID: "c7dcac32-7e95-4025-8aee-a985b13b52ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.750924 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7dcac32-7e95-4025-8aee-a985b13b52ef" (UID: "c7dcac32-7e95-4025-8aee-a985b13b52ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.752690 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7dcac32-7e95-4025-8aee-a985b13b52ef" (UID: "c7dcac32-7e95-4025-8aee-a985b13b52ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.760121 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-config" (OuterVolumeSpecName: "config") pod "c7dcac32-7e95-4025-8aee-a985b13b52ef" (UID: "c7dcac32-7e95-4025-8aee-a985b13b52ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.785678 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blslr\" (UniqueName: \"kubernetes.io/projected/c7dcac32-7e95-4025-8aee-a985b13b52ef-kube-api-access-blslr\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.785717 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.785727 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-config\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.785735 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:27 crc kubenswrapper[4766]: I1209 04:59:27.785744 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7dcac32-7e95-4025-8aee-a985b13b52ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.221521 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" event={"ID":"c7dcac32-7e95-4025-8aee-a985b13b52ef","Type":"ContainerDied","Data":"adc475f353efcb811991a6406bbc06d8afee4b906b3ccc081ff70af4082b3268"} Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.221586 4766 scope.go:117] "RemoveContainer" containerID="6fc6b3ef12267c15d96698b8b2c4282895962bbd8a89ebd7e75b696ada87f8a8" Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.221929 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b67fc789-stj2f" Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.250089 4766 scope.go:117] "RemoveContainer" containerID="dfa46574415340b45d64ad78c3ff1899bbe4efa2724142cac8469bc7840fdbdb" Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.273180 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b67fc789-stj2f"] Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.285638 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57b67fc789-stj2f"] Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.315874 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.316743 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="ceilometer-central-agent" containerID="cri-o://cc9850683755f976f0bcf6f338252ab996a8d9955376018a5b48a5de13a663e9" gracePeriod=30 Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.316958 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="proxy-httpd" containerID="cri-o://a547c389b4a655fe19324728e7f9c22ad279cc356bb9d6be930557d75659f217" gracePeriod=30 Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.317201 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="ceilometer-notification-agent" containerID="cri-o://323919ccc4b2929763ab6b4f34106ddfac5c810d5c1736fe900eb2476cedcd53" gracePeriod=30 Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.317308 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" 
containerName="sg-core" containerID="cri-o://71507acc2593daebe497e2ad8fd906ccb2348c01a1c6389b199844058257f52b" gracePeriod=30 Dec 09 04:59:28 crc kubenswrapper[4766]: I1209 04:59:28.851565 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7dcac32-7e95-4025-8aee-a985b13b52ef" path="/var/lib/kubelet/pods/c7dcac32-7e95-4025-8aee-a985b13b52ef/volumes" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.237815 4766 generic.go:334] "Generic (PLEG): container finished" podID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerID="a547c389b4a655fe19324728e7f9c22ad279cc356bb9d6be930557d75659f217" exitCode=0 Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.237842 4766 generic.go:334] "Generic (PLEG): container finished" podID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerID="71507acc2593daebe497e2ad8fd906ccb2348c01a1c6389b199844058257f52b" exitCode=2 Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.237850 4766 generic.go:334] "Generic (PLEG): container finished" podID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerID="323919ccc4b2929763ab6b4f34106ddfac5c810d5c1736fe900eb2476cedcd53" exitCode=0 Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.237857 4766 generic.go:334] "Generic (PLEG): container finished" podID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerID="cc9850683755f976f0bcf6f338252ab996a8d9955376018a5b48a5de13a663e9" exitCode=0 Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.237891 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerDied","Data":"a547c389b4a655fe19324728e7f9c22ad279cc356bb9d6be930557d75659f217"} Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.237915 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerDied","Data":"71507acc2593daebe497e2ad8fd906ccb2348c01a1c6389b199844058257f52b"} Dec 09 
04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.237926 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerDied","Data":"323919ccc4b2929763ab6b4f34106ddfac5c810d5c1736fe900eb2476cedcd53"} Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.237934 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerDied","Data":"cc9850683755f976f0bcf6f338252ab996a8d9955376018a5b48a5de13a663e9"} Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.618511 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.659313 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-combined-ca-bundle\") pod \"fc6adaf4-b846-4113-a338-1bfeccca7a82\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.659510 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-run-httpd\") pod \"fc6adaf4-b846-4113-a338-1bfeccca7a82\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.659594 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-scripts\") pod \"fc6adaf4-b846-4113-a338-1bfeccca7a82\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.659670 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-config-data\") pod \"fc6adaf4-b846-4113-a338-1bfeccca7a82\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.659698 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-sg-core-conf-yaml\") pod \"fc6adaf4-b846-4113-a338-1bfeccca7a82\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.659719 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kncfs\" (UniqueName: \"kubernetes.io/projected/fc6adaf4-b846-4113-a338-1bfeccca7a82-kube-api-access-kncfs\") pod \"fc6adaf4-b846-4113-a338-1bfeccca7a82\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.659739 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-log-httpd\") pod \"fc6adaf4-b846-4113-a338-1bfeccca7a82\" (UID: \"fc6adaf4-b846-4113-a338-1bfeccca7a82\") " Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.660676 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fc6adaf4-b846-4113-a338-1bfeccca7a82" (UID: "fc6adaf4-b846-4113-a338-1bfeccca7a82"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.667703 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fc6adaf4-b846-4113-a338-1bfeccca7a82" (UID: "fc6adaf4-b846-4113-a338-1bfeccca7a82"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.675371 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-scripts" (OuterVolumeSpecName: "scripts") pod "fc6adaf4-b846-4113-a338-1bfeccca7a82" (UID: "fc6adaf4-b846-4113-a338-1bfeccca7a82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.679687 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6adaf4-b846-4113-a338-1bfeccca7a82-kube-api-access-kncfs" (OuterVolumeSpecName: "kube-api-access-kncfs") pod "fc6adaf4-b846-4113-a338-1bfeccca7a82" (UID: "fc6adaf4-b846-4113-a338-1bfeccca7a82"). InnerVolumeSpecName "kube-api-access-kncfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.716560 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fc6adaf4-b846-4113-a338-1bfeccca7a82" (UID: "fc6adaf4-b846-4113-a338-1bfeccca7a82"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.763139 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.763177 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-scripts\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.763186 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.763197 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kncfs\" (UniqueName: \"kubernetes.io/projected/fc6adaf4-b846-4113-a338-1bfeccca7a82-kube-api-access-kncfs\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.763226 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc6adaf4-b846-4113-a338-1bfeccca7a82-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.820798 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc6adaf4-b846-4113-a338-1bfeccca7a82" (UID: "fc6adaf4-b846-4113-a338-1bfeccca7a82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.840800 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-config-data" (OuterVolumeSpecName: "config-data") pod "fc6adaf4-b846-4113-a338-1bfeccca7a82" (UID: "fc6adaf4-b846-4113-a338-1bfeccca7a82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.864789 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:29 crc kubenswrapper[4766]: I1209 04:59:29.864828 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6adaf4-b846-4113-a338-1bfeccca7a82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.256296 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc6adaf4-b846-4113-a338-1bfeccca7a82","Type":"ContainerDied","Data":"77944959b05a60557f366a1ac3908a8d526fe6d2aaa7d538f785cf8027d73597"} Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.256674 4766 scope.go:117] "RemoveContainer" containerID="a547c389b4a655fe19324728e7f9c22ad279cc356bb9d6be930557d75659f217" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.256346 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.276882 4766 scope.go:117] "RemoveContainer" containerID="71507acc2593daebe497e2ad8fd906ccb2348c01a1c6389b199844058257f52b" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.293166 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.302226 4766 scope.go:117] "RemoveContainer" containerID="323919ccc4b2929763ab6b4f34106ddfac5c810d5c1736fe900eb2476cedcd53" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.312113 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.331026 4766 scope.go:117] "RemoveContainer" containerID="cc9850683755f976f0bcf6f338252ab996a8d9955376018a5b48a5de13a663e9" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.331186 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:59:30 crc kubenswrapper[4766]: E1209 04:59:30.331745 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="ceilometer-central-agent" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.331770 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="ceilometer-central-agent" Dec 09 04:59:30 crc kubenswrapper[4766]: E1209 04:59:30.331795 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dcac32-7e95-4025-8aee-a985b13b52ef" containerName="init" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.331804 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dcac32-7e95-4025-8aee-a985b13b52ef" containerName="init" Dec 09 04:59:30 crc kubenswrapper[4766]: E1209 04:59:30.331829 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" 
containerName="proxy-httpd" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.331837 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="proxy-httpd" Dec 09 04:59:30 crc kubenswrapper[4766]: E1209 04:59:30.331852 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="ceilometer-notification-agent" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.331860 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="ceilometer-notification-agent" Dec 09 04:59:30 crc kubenswrapper[4766]: E1209 04:59:30.331888 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="sg-core" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.331898 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="sg-core" Dec 09 04:59:30 crc kubenswrapper[4766]: E1209 04:59:30.331911 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dcac32-7e95-4025-8aee-a985b13b52ef" containerName="dnsmasq-dns" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.331921 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dcac32-7e95-4025-8aee-a985b13b52ef" containerName="dnsmasq-dns" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.332190 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="proxy-httpd" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.332251 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="sg-core" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.332278 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" 
containerName="ceilometer-central-agent" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.332306 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" containerName="ceilometer-notification-agent" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.332325 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dcac32-7e95-4025-8aee-a985b13b52ef" containerName="dnsmasq-dns" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.335047 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.338878 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.339149 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.342327 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.392084 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-scripts\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.392146 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.392294 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-vtvzg\" (UniqueName: \"kubernetes.io/projected/86691648-ed71-473b-9f1a-51b0982e140a-kube-api-access-vtvzg\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.392318 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-config-data\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.392465 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.392516 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86691648-ed71-473b-9f1a-51b0982e140a-run-httpd\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.392702 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86691648-ed71-473b-9f1a-51b0982e140a-log-httpd\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.494377 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.494480 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvzg\" (UniqueName: \"kubernetes.io/projected/86691648-ed71-473b-9f1a-51b0982e140a-kube-api-access-vtvzg\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.494513 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-config-data\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.494607 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.494642 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86691648-ed71-473b-9f1a-51b0982e140a-run-httpd\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.494662 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86691648-ed71-473b-9f1a-51b0982e140a-log-httpd\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.494727 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-scripts\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.495828 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86691648-ed71-473b-9f1a-51b0982e140a-run-httpd\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.496136 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86691648-ed71-473b-9f1a-51b0982e140a-log-httpd\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.498742 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-scripts\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.499076 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.499939 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-config-data\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.500009 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86691648-ed71-473b-9f1a-51b0982e140a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.521408 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvzg\" (UniqueName: \"kubernetes.io/projected/86691648-ed71-473b-9f1a-51b0982e140a-kube-api-access-vtvzg\") pod \"ceilometer-0\" (UID: \"86691648-ed71-473b-9f1a-51b0982e140a\") " pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.657624 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 09 04:59:30 crc kubenswrapper[4766]: I1209 04:59:30.854525 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc6adaf4-b846-4113-a338-1bfeccca7a82" path="/var/lib/kubelet/pods/fc6adaf4-b846-4113-a338-1bfeccca7a82/volumes" Dec 09 04:59:31 crc kubenswrapper[4766]: I1209 04:59:31.135749 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 09 04:59:31 crc kubenswrapper[4766]: W1209 04:59:31.140012 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86691648_ed71_473b_9f1a_51b0982e140a.slice/crio-9b407a336b93ff15534e691326c39a7d9c7223009f4cfcf7bc3ca40ca2385b21 WatchSource:0}: Error finding container 9b407a336b93ff15534e691326c39a7d9c7223009f4cfcf7bc3ca40ca2385b21: Status 404 returned error can't find the container with id 9b407a336b93ff15534e691326c39a7d9c7223009f4cfcf7bc3ca40ca2385b21 Dec 09 04:59:31 crc kubenswrapper[4766]: I1209 04:59:31.289179 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"86691648-ed71-473b-9f1a-51b0982e140a","Type":"ContainerStarted","Data":"9b407a336b93ff15534e691326c39a7d9c7223009f4cfcf7bc3ca40ca2385b21"} Dec 09 04:59:31 crc kubenswrapper[4766]: I1209 04:59:31.840748 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:59:31 crc kubenswrapper[4766]: E1209 04:59:31.841466 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:59:32 crc kubenswrapper[4766]: I1209 04:59:32.301289 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86691648-ed71-473b-9f1a-51b0982e140a","Type":"ContainerStarted","Data":"84a02866b882038205b6be098cd905ad938882f5052715cb05b33ba707b52cc0"} Dec 09 04:59:33 crc kubenswrapper[4766]: I1209 04:59:33.314451 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86691648-ed71-473b-9f1a-51b0982e140a","Type":"ContainerStarted","Data":"36a1fe26c3b1ff22fdbe235654bf7a02c106480629902d1d72a78e437fb1c7e1"} Dec 09 04:59:33 crc kubenswrapper[4766]: I1209 04:59:33.315072 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86691648-ed71-473b-9f1a-51b0982e140a","Type":"ContainerStarted","Data":"c75e3383448fc1413f897fb0a490eb5962119235677c3227c691ad163678b087"} Dec 09 04:59:35 crc kubenswrapper[4766]: I1209 04:59:35.341466 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"86691648-ed71-473b-9f1a-51b0982e140a","Type":"ContainerStarted","Data":"fb4eb79925a29095ea9f8bdea067bcc32d4c64951b5bc15187ed5d1543342632"} Dec 09 04:59:35 crc kubenswrapper[4766]: I1209 04:59:35.344199 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 09 04:59:35 crc kubenswrapper[4766]: I1209 04:59:35.386095 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.957366478 podStartE2EDuration="5.386072325s" podCreationTimestamp="2025-12-09 04:59:30 +0000 UTC" firstStartedPulling="2025-12-09 04:59:31.144456125 +0000 UTC m=+6452.853761551" lastFinishedPulling="2025-12-09 04:59:34.573161962 +0000 UTC m=+6456.282467398" observedRunningTime="2025-12-09 04:59:35.372500858 +0000 UTC m=+6457.081806304" watchObservedRunningTime="2025-12-09 04:59:35.386072325 +0000 UTC m=+6457.095377791" Dec 09 04:59:38 crc kubenswrapper[4766]: I1209 04:59:38.506953 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 09 04:59:38 crc kubenswrapper[4766]: I1209 04:59:38.616820 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 09 04:59:38 crc kubenswrapper[4766]: I1209 04:59:38.951497 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 09 04:59:39 crc kubenswrapper[4766]: I1209 04:59:39.037730 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qxppj"] Dec 09 04:59:39 crc kubenswrapper[4766]: I1209 04:59:39.053477 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qxppj"] Dec 09 04:59:40 crc kubenswrapper[4766]: I1209 04:59:40.051596 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a499-account-create-update-gm95l"] Dec 09 04:59:40 crc kubenswrapper[4766]: I1209 04:59:40.079110 4766 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a499-account-create-update-gm95l"] Dec 09 04:59:40 crc kubenswrapper[4766]: I1209 04:59:40.856943 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623" path="/var/lib/kubelet/pods/8fd1d7ad-2732-46d8-85ad-8a8b5e7a7623/volumes" Dec 09 04:59:40 crc kubenswrapper[4766]: I1209 04:59:40.858300 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbbece1b-ed34-4636-b968-3adf08d1becb" path="/var/lib/kubelet/pods/fbbece1b-ed34-4636-b968-3adf08d1becb/volumes" Dec 09 04:59:42 crc kubenswrapper[4766]: I1209 04:59:42.839933 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:59:42 crc kubenswrapper[4766]: E1209 04:59:42.840567 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 04:59:47 crc kubenswrapper[4766]: I1209 04:59:47.038430 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xlvns"] Dec 09 04:59:47 crc kubenswrapper[4766]: I1209 04:59:47.050165 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xlvns"] Dec 09 04:59:48 crc kubenswrapper[4766]: I1209 04:59:48.853701 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e4ae20-163f-4d0a-a59a-4c4e79ec7129" path="/var/lib/kubelet/pods/e6e4ae20-163f-4d0a-a59a-4c4e79ec7129/volumes" Dec 09 04:59:54 crc kubenswrapper[4766]: I1209 04:59:54.840190 4766 scope.go:117] "RemoveContainer" 
containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 04:59:54 crc kubenswrapper[4766]: E1209 04:59:54.841295 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.168901 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x"] Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.171741 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.177356 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.182160 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.183913 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x"] Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.305712 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07051b6a-932b-405d-a242-6415f1b28e94-config-volume\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.306059 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj98w\" (UniqueName: \"kubernetes.io/projected/07051b6a-932b-405d-a242-6415f1b28e94-kube-api-access-jj98w\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.306230 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07051b6a-932b-405d-a242-6415f1b28e94-secret-volume\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.407942 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj98w\" (UniqueName: \"kubernetes.io/projected/07051b6a-932b-405d-a242-6415f1b28e94-kube-api-access-jj98w\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.408006 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07051b6a-932b-405d-a242-6415f1b28e94-secret-volume\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.408076 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/07051b6a-932b-405d-a242-6415f1b28e94-config-volume\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.409109 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07051b6a-932b-405d-a242-6415f1b28e94-config-volume\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.414936 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07051b6a-932b-405d-a242-6415f1b28e94-secret-volume\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.438731 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj98w\" (UniqueName: \"kubernetes.io/projected/07051b6a-932b-405d-a242-6415f1b28e94-kube-api-access-jj98w\") pod \"collect-profiles-29420940-wt72x\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.504158 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:00 crc kubenswrapper[4766]: I1209 05:00:00.673640 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 09 05:00:01 crc kubenswrapper[4766]: I1209 05:00:01.095459 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x"] Dec 09 05:00:01 crc kubenswrapper[4766]: I1209 05:00:01.672871 4766 generic.go:334] "Generic (PLEG): container finished" podID="07051b6a-932b-405d-a242-6415f1b28e94" containerID="984b6dc6c2b62272e6a701cb7b4b69bd6ae613f5a4475ff15a52c6430dd2fb92" exitCode=0 Dec 09 05:00:01 crc kubenswrapper[4766]: I1209 05:00:01.672923 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" event={"ID":"07051b6a-932b-405d-a242-6415f1b28e94","Type":"ContainerDied","Data":"984b6dc6c2b62272e6a701cb7b4b69bd6ae613f5a4475ff15a52c6430dd2fb92"} Dec 09 05:00:01 crc kubenswrapper[4766]: I1209 05:00:01.673368 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" event={"ID":"07051b6a-932b-405d-a242-6415f1b28e94","Type":"ContainerStarted","Data":"81c62bd2ec684476705cbb19ead448e22546dc05a36f1ba4b3a862ffdff79a42"} Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.132931 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.190413 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj98w\" (UniqueName: \"kubernetes.io/projected/07051b6a-932b-405d-a242-6415f1b28e94-kube-api-access-jj98w\") pod \"07051b6a-932b-405d-a242-6415f1b28e94\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.190508 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07051b6a-932b-405d-a242-6415f1b28e94-secret-volume\") pod \"07051b6a-932b-405d-a242-6415f1b28e94\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.190781 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07051b6a-932b-405d-a242-6415f1b28e94-config-volume\") pod \"07051b6a-932b-405d-a242-6415f1b28e94\" (UID: \"07051b6a-932b-405d-a242-6415f1b28e94\") " Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.191607 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07051b6a-932b-405d-a242-6415f1b28e94-config-volume" (OuterVolumeSpecName: "config-volume") pod "07051b6a-932b-405d-a242-6415f1b28e94" (UID: "07051b6a-932b-405d-a242-6415f1b28e94"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.198336 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07051b6a-932b-405d-a242-6415f1b28e94-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "07051b6a-932b-405d-a242-6415f1b28e94" (UID: "07051b6a-932b-405d-a242-6415f1b28e94"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.201153 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07051b6a-932b-405d-a242-6415f1b28e94-kube-api-access-jj98w" (OuterVolumeSpecName: "kube-api-access-jj98w") pod "07051b6a-932b-405d-a242-6415f1b28e94" (UID: "07051b6a-932b-405d-a242-6415f1b28e94"). InnerVolumeSpecName "kube-api-access-jj98w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.293701 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07051b6a-932b-405d-a242-6415f1b28e94-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.293736 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj98w\" (UniqueName: \"kubernetes.io/projected/07051b6a-932b-405d-a242-6415f1b28e94-kube-api-access-jj98w\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.293748 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07051b6a-932b-405d-a242-6415f1b28e94-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.696529 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" event={"ID":"07051b6a-932b-405d-a242-6415f1b28e94","Type":"ContainerDied","Data":"81c62bd2ec684476705cbb19ead448e22546dc05a36f1ba4b3a862ffdff79a42"} Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.696837 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81c62bd2ec684476705cbb19ead448e22546dc05a36f1ba4b3a862ffdff79a42" Dec 09 05:00:03 crc kubenswrapper[4766]: I1209 05:00:03.696588 4766 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x" Dec 09 05:00:04 crc kubenswrapper[4766]: I1209 05:00:04.216573 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn"] Dec 09 05:00:04 crc kubenswrapper[4766]: I1209 05:00:04.228299 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420895-76wnn"] Dec 09 05:00:04 crc kubenswrapper[4766]: I1209 05:00:04.856865 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be850a52-d57f-4c18-a9b2-209ad7879827" path="/var/lib/kubelet/pods/be850a52-d57f-4c18-a9b2-209ad7879827/volumes" Dec 09 05:00:09 crc kubenswrapper[4766]: I1209 05:00:09.839021 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:00:09 crc kubenswrapper[4766]: E1209 05:00:09.839892 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:00:20 crc kubenswrapper[4766]: I1209 05:00:20.840668 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:00:20 crc kubenswrapper[4766]: E1209 05:00:20.841957 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:00:24 crc kubenswrapper[4766]: I1209 05:00:24.445470 4766 scope.go:117] "RemoveContainer" containerID="49abd4c8b1442e1413c221ae0f5d36bcb97d7353ebf0e0353c54876f4eb71e1d" Dec 09 05:00:24 crc kubenswrapper[4766]: I1209 05:00:24.478691 4766 scope.go:117] "RemoveContainer" containerID="d5ba375f9aba665c430573120c223f61cd6a72535d41e263a56733cd92f27570" Dec 09 05:00:24 crc kubenswrapper[4766]: I1209 05:00:24.534145 4766 scope.go:117] "RemoveContainer" containerID="aaed8c6d78317aa2aaad76b76827c34eadecef7ee458f511dd79f83afd19990d" Dec 09 05:00:24 crc kubenswrapper[4766]: I1209 05:00:24.579124 4766 scope.go:117] "RemoveContainer" containerID="c391c1b015598de9f6366dc22be451a03dcab1f5e6dba611fb7c34df1d269a44" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.894364 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54675c5b47-vxxgx"] Dec 09 05:00:27 crc kubenswrapper[4766]: E1209 05:00:27.895447 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07051b6a-932b-405d-a242-6415f1b28e94" containerName="collect-profiles" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.895467 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="07051b6a-932b-405d-a242-6415f1b28e94" containerName="collect-profiles" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.895719 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="07051b6a-932b-405d-a242-6415f1b28e94" containerName="collect-profiles" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.896864 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.898629 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.908524 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54675c5b47-vxxgx"] Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.994248 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-config\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.994303 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-sb\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.994326 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-dns-svc\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.994341 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdsks\" (UniqueName: \"kubernetes.io/projected/8a96a068-ae40-41b7-b734-8bbf7d33126a-kube-api-access-cdsks\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " 
pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.994635 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-openstack-cell1\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:27 crc kubenswrapper[4766]: I1209 05:00:27.994783 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-nb\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.097277 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-openstack-cell1\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.097396 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-nb\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.097528 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-config\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " 
pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.097564 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-sb\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.097592 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-dns-svc\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.097716 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdsks\" (UniqueName: \"kubernetes.io/projected/8a96a068-ae40-41b7-b734-8bbf7d33126a-kube-api-access-cdsks\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.098711 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-dns-svc\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.098730 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-nb\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc 
kubenswrapper[4766]: I1209 05:00:28.098800 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-config\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.098831 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-sb\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.099157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-openstack-cell1\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.132308 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdsks\" (UniqueName: \"kubernetes.io/projected/8a96a068-ae40-41b7-b734-8bbf7d33126a-kube-api-access-cdsks\") pod \"dnsmasq-dns-54675c5b47-vxxgx\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.218855 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:28 crc kubenswrapper[4766]: I1209 05:00:28.750440 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54675c5b47-vxxgx"] Dec 09 05:00:29 crc kubenswrapper[4766]: I1209 05:00:29.032941 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" event={"ID":"8a96a068-ae40-41b7-b734-8bbf7d33126a","Type":"ContainerStarted","Data":"f289870637a54985d05da259e81220ed64207b131479bde0911719f361e6671c"} Dec 09 05:00:30 crc kubenswrapper[4766]: I1209 05:00:30.060143 4766 generic.go:334] "Generic (PLEG): container finished" podID="8a96a068-ae40-41b7-b734-8bbf7d33126a" containerID="14467922eec6ab175fa2754214590c8647e9604b582181348b4cc1b6372ee447" exitCode=0 Dec 09 05:00:30 crc kubenswrapper[4766]: I1209 05:00:30.060227 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" event={"ID":"8a96a068-ae40-41b7-b734-8bbf7d33126a","Type":"ContainerDied","Data":"14467922eec6ab175fa2754214590c8647e9604b582181348b4cc1b6372ee447"} Dec 09 05:00:31 crc kubenswrapper[4766]: I1209 05:00:31.076308 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" event={"ID":"8a96a068-ae40-41b7-b734-8bbf7d33126a","Type":"ContainerStarted","Data":"924aeaf6b7236eca9506a0f744d6da380a8e1109189c8b8ca831d50b995abe75"} Dec 09 05:00:31 crc kubenswrapper[4766]: I1209 05:00:31.076611 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:31 crc kubenswrapper[4766]: I1209 05:00:31.104928 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" podStartSLOduration=4.104909392 podStartE2EDuration="4.104909392s" podCreationTimestamp="2025-12-09 05:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:00:31.101364796 +0000 UTC m=+6512.810670232" watchObservedRunningTime="2025-12-09 05:00:31.104909392 +0000 UTC m=+6512.814214818" Dec 09 05:00:34 crc kubenswrapper[4766]: I1209 05:00:34.839737 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:00:34 crc kubenswrapper[4766]: E1209 05:00:34.840611 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.221600 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.316096 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-574fd486b5-dr8gv"] Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.316376 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" podUID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" containerName="dnsmasq-dns" containerID="cri-o://eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2" gracePeriod=10 Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.562290 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cdbcddd87-rz29w"] Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.567365 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.581434 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cdbcddd87-rz29w"] Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.640554 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqmq\" (UniqueName: \"kubernetes.io/projected/973b0c8d-4740-4361-99a9-aeb033dd98f5-kube-api-access-thqmq\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.641173 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-ovsdbserver-sb\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.641525 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-config\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.642628 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-ovsdbserver-nb\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.644475 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-dns-svc\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.644644 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-openstack-cell1\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.747031 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-dns-svc\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.747389 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-openstack-cell1\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.747457 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thqmq\" (UniqueName: \"kubernetes.io/projected/973b0c8d-4740-4361-99a9-aeb033dd98f5-kube-api-access-thqmq\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.747585 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-ovsdbserver-sb\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.747605 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-config\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.747621 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-ovsdbserver-nb\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.747967 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-dns-svc\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.748441 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-openstack-cell1\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.748513 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-ovsdbserver-nb\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.748619 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-ovsdbserver-sb\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.748822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/973b0c8d-4740-4361-99a9-aeb033dd98f5-config\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.797868 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqmq\" (UniqueName: \"kubernetes.io/projected/973b0c8d-4740-4361-99a9-aeb033dd98f5-kube-api-access-thqmq\") pod \"dnsmasq-dns-6cdbcddd87-rz29w\" (UID: \"973b0c8d-4740-4361-99a9-aeb033dd98f5\") " pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:38 crc kubenswrapper[4766]: I1209 05:00:38.889374 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.048183 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.159920 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-sb\") pod \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.160019 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-nb\") pod \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.160075 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-dns-svc\") pod \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.160960 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4sn8\" (UniqueName: \"kubernetes.io/projected/5a45af36-95fe-48bb-9711-fbe9f418eaa4-kube-api-access-r4sn8\") pod \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.161016 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-config\") pod \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\" (UID: \"5a45af36-95fe-48bb-9711-fbe9f418eaa4\") " Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.181108 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5a45af36-95fe-48bb-9711-fbe9f418eaa4-kube-api-access-r4sn8" (OuterVolumeSpecName: "kube-api-access-r4sn8") pod "5a45af36-95fe-48bb-9711-fbe9f418eaa4" (UID: "5a45af36-95fe-48bb-9711-fbe9f418eaa4"). InnerVolumeSpecName "kube-api-access-r4sn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.185086 4766 generic.go:334] "Generic (PLEG): container finished" podID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" containerID="eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2" exitCode=0 Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.185297 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" event={"ID":"5a45af36-95fe-48bb-9711-fbe9f418eaa4","Type":"ContainerDied","Data":"eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2"} Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.185561 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" event={"ID":"5a45af36-95fe-48bb-9711-fbe9f418eaa4","Type":"ContainerDied","Data":"33f533f2ebe2eb1c391fc368c3b07b6f3d1953a621cadf36c52dae915cfa1ada"} Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.185633 4766 scope.go:117] "RemoveContainer" containerID="eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.185385 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-574fd486b5-dr8gv" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.223510 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5a45af36-95fe-48bb-9711-fbe9f418eaa4" (UID: "5a45af36-95fe-48bb-9711-fbe9f418eaa4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.223523 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a45af36-95fe-48bb-9711-fbe9f418eaa4" (UID: "5a45af36-95fe-48bb-9711-fbe9f418eaa4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.231786 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-config" (OuterVolumeSpecName: "config") pod "5a45af36-95fe-48bb-9711-fbe9f418eaa4" (UID: "5a45af36-95fe-48bb-9711-fbe9f418eaa4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.234435 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5a45af36-95fe-48bb-9711-fbe9f418eaa4" (UID: "5a45af36-95fe-48bb-9711-fbe9f418eaa4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.258611 4766 scope.go:117] "RemoveContainer" containerID="78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.263130 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.263164 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.263176 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.263187 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4sn8\" (UniqueName: \"kubernetes.io/projected/5a45af36-95fe-48bb-9711-fbe9f418eaa4-kube-api-access-r4sn8\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.263197 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a45af36-95fe-48bb-9711-fbe9f418eaa4-config\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.284047 4766 scope.go:117] "RemoveContainer" containerID="eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2" Dec 09 05:00:39 crc kubenswrapper[4766]: E1209 05:00:39.284425 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2\": 
container with ID starting with eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2 not found: ID does not exist" containerID="eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.284460 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2"} err="failed to get container status \"eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2\": rpc error: code = NotFound desc = could not find container \"eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2\": container with ID starting with eac2d04a778fb0b83080b02a3583360b2285a6b1b4c23c26497399c0c5acd9f2 not found: ID does not exist" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.284480 4766 scope.go:117] "RemoveContainer" containerID="78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f" Dec 09 05:00:39 crc kubenswrapper[4766]: E1209 05:00:39.284813 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f\": container with ID starting with 78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f not found: ID does not exist" containerID="78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.284841 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f"} err="failed to get container status \"78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f\": rpc error: code = NotFound desc = could not find container \"78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f\": container with ID starting with 
78b749e18d0bf489db132212c3d0643cb6ef5620a832c09bd10dfbcedb66de2f not found: ID does not exist" Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.436605 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cdbcddd87-rz29w"] Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.638858 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-574fd486b5-dr8gv"] Dec 09 05:00:39 crc kubenswrapper[4766]: I1209 05:00:39.651994 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-574fd486b5-dr8gv"] Dec 09 05:00:40 crc kubenswrapper[4766]: I1209 05:00:40.196495 4766 generic.go:334] "Generic (PLEG): container finished" podID="973b0c8d-4740-4361-99a9-aeb033dd98f5" containerID="debb609855dc4318d01e4ebff799608bc8ec8dc4c98b0bfa128b915f16e706e2" exitCode=0 Dec 09 05:00:40 crc kubenswrapper[4766]: I1209 05:00:40.196531 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" event={"ID":"973b0c8d-4740-4361-99a9-aeb033dd98f5","Type":"ContainerDied","Data":"debb609855dc4318d01e4ebff799608bc8ec8dc4c98b0bfa128b915f16e706e2"} Dec 09 05:00:40 crc kubenswrapper[4766]: I1209 05:00:40.196553 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" event={"ID":"973b0c8d-4740-4361-99a9-aeb033dd98f5","Type":"ContainerStarted","Data":"3aacfa0aabb352a32e6b9b9ccb989e453d97f556a8c5a54cdb4dcd2d32bf2a0a"} Dec 09 05:00:40 crc kubenswrapper[4766]: I1209 05:00:40.853858 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" path="/var/lib/kubelet/pods/5a45af36-95fe-48bb-9711-fbe9f418eaa4/volumes" Dec 09 05:00:41 crc kubenswrapper[4766]: I1209 05:00:41.209243 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" 
event={"ID":"973b0c8d-4740-4361-99a9-aeb033dd98f5","Type":"ContainerStarted","Data":"4a3b2571fe9f90c2a18fb159af7d81c7282ed0dee48e34f3cf493c5063504f81"} Dec 09 05:00:41 crc kubenswrapper[4766]: I1209 05:00:41.209597 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:41 crc kubenswrapper[4766]: I1209 05:00:41.246084 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" podStartSLOduration=3.246055782 podStartE2EDuration="3.246055782s" podCreationTimestamp="2025-12-09 05:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:00:41.230510812 +0000 UTC m=+6522.939816258" watchObservedRunningTime="2025-12-09 05:00:41.246055782 +0000 UTC m=+6522.955361248" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.498995 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b"] Dec 09 05:00:45 crc kubenswrapper[4766]: E1209 05:00:45.499832 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" containerName="init" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.499845 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" containerName="init" Dec 09 05:00:45 crc kubenswrapper[4766]: E1209 05:00:45.499877 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" containerName="dnsmasq-dns" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.499883 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" containerName="dnsmasq-dns" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.500138 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5a45af36-95fe-48bb-9711-fbe9f418eaa4" containerName="dnsmasq-dns" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.500879 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.505837 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.505846 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.506049 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.506110 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.527116 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b"] Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.621864 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.621933 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tjjc\" (UniqueName: 
\"kubernetes.io/projected/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-kube-api-access-2tjjc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.621969 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.622059 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.622736 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.725718 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-pre-adoption-validation-combined-ca-bundle\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.725820 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tjjc\" (UniqueName: \"kubernetes.io/projected/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-kube-api-access-2tjjc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.725869 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.725944 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.726062 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 
09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.735079 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.735244 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.736592 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.742330 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.771342 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tjjc\" (UniqueName: 
\"kubernetes.io/projected/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-kube-api-access-2tjjc\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cs886b\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:45 crc kubenswrapper[4766]: I1209 05:00:45.829472 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:00:46 crc kubenswrapper[4766]: I1209 05:00:46.522258 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b"] Dec 09 05:00:46 crc kubenswrapper[4766]: W1209 05:00:46.524672 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90e77954_e1ed_4e5d_97c1_bdd8e94841b7.slice/crio-5e802ed54b73c94caa3c17951aad8ad1f83c2ea0cc63eaa18ee14dabb400aa39 WatchSource:0}: Error finding container 5e802ed54b73c94caa3c17951aad8ad1f83c2ea0cc63eaa18ee14dabb400aa39: Status 404 returned error can't find the container with id 5e802ed54b73c94caa3c17951aad8ad1f83c2ea0cc63eaa18ee14dabb400aa39 Dec 09 05:00:46 crc kubenswrapper[4766]: I1209 05:00:46.528910 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:00:47 crc kubenswrapper[4766]: I1209 05:00:47.286789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" event={"ID":"90e77954-e1ed-4e5d-97c1-bdd8e94841b7","Type":"ContainerStarted","Data":"5e802ed54b73c94caa3c17951aad8ad1f83c2ea0cc63eaa18ee14dabb400aa39"} Dec 09 05:00:47 crc kubenswrapper[4766]: I1209 05:00:47.840678 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:00:47 crc kubenswrapper[4766]: E1209 
05:00:47.841313 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:00:48 crc kubenswrapper[4766]: I1209 05:00:48.895184 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cdbcddd87-rz29w" Dec 09 05:00:48 crc kubenswrapper[4766]: I1209 05:00:48.962379 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54675c5b47-vxxgx"] Dec 09 05:00:48 crc kubenswrapper[4766]: I1209 05:00:48.962633 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" podUID="8a96a068-ae40-41b7-b734-8bbf7d33126a" containerName="dnsmasq-dns" containerID="cri-o://924aeaf6b7236eca9506a0f744d6da380a8e1109189c8b8ca831d50b995abe75" gracePeriod=10 Dec 09 05:00:49 crc kubenswrapper[4766]: I1209 05:00:49.308832 4766 generic.go:334] "Generic (PLEG): container finished" podID="8a96a068-ae40-41b7-b734-8bbf7d33126a" containerID="924aeaf6b7236eca9506a0f744d6da380a8e1109189c8b8ca831d50b995abe75" exitCode=0 Dec 09 05:00:49 crc kubenswrapper[4766]: I1209 05:00:49.308898 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" event={"ID":"8a96a068-ae40-41b7-b734-8bbf7d33126a","Type":"ContainerDied","Data":"924aeaf6b7236eca9506a0f744d6da380a8e1109189c8b8ca831d50b995abe75"} Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.573181 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.712589 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-sb\") pod \"8a96a068-ae40-41b7-b734-8bbf7d33126a\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.712640 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-openstack-cell1\") pod \"8a96a068-ae40-41b7-b734-8bbf7d33126a\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.712746 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-dns-svc\") pod \"8a96a068-ae40-41b7-b734-8bbf7d33126a\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.712849 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-nb\") pod \"8a96a068-ae40-41b7-b734-8bbf7d33126a\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.712902 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdsks\" (UniqueName: \"kubernetes.io/projected/8a96a068-ae40-41b7-b734-8bbf7d33126a-kube-api-access-cdsks\") pod \"8a96a068-ae40-41b7-b734-8bbf7d33126a\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.712924 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-config\") pod \"8a96a068-ae40-41b7-b734-8bbf7d33126a\" (UID: \"8a96a068-ae40-41b7-b734-8bbf7d33126a\") " Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.717910 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a96a068-ae40-41b7-b734-8bbf7d33126a-kube-api-access-cdsks" (OuterVolumeSpecName: "kube-api-access-cdsks") pod "8a96a068-ae40-41b7-b734-8bbf7d33126a" (UID: "8a96a068-ae40-41b7-b734-8bbf7d33126a"). InnerVolumeSpecName "kube-api-access-cdsks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.766410 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a96a068-ae40-41b7-b734-8bbf7d33126a" (UID: "8a96a068-ae40-41b7-b734-8bbf7d33126a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.769522 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a96a068-ae40-41b7-b734-8bbf7d33126a" (UID: "8a96a068-ae40-41b7-b734-8bbf7d33126a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.770741 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "8a96a068-ae40-41b7-b734-8bbf7d33126a" (UID: "8a96a068-ae40-41b7-b734-8bbf7d33126a"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.771838 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a96a068-ae40-41b7-b734-8bbf7d33126a" (UID: "8a96a068-ae40-41b7-b734-8bbf7d33126a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.791778 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-config" (OuterVolumeSpecName: "config") pod "8a96a068-ae40-41b7-b734-8bbf7d33126a" (UID: "8a96a068-ae40-41b7-b734-8bbf7d33126a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.815410 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.815434 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdsks\" (UniqueName: \"kubernetes.io/projected/8a96a068-ae40-41b7-b734-8bbf7d33126a-kube-api-access-cdsks\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.815446 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-config\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.815454 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 09 
05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.815462 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:51 crc kubenswrapper[4766]: I1209 05:00:51.815472 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a96a068-ae40-41b7-b734-8bbf7d33126a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 09 05:00:52 crc kubenswrapper[4766]: I1209 05:00:52.340084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" event={"ID":"8a96a068-ae40-41b7-b734-8bbf7d33126a","Type":"ContainerDied","Data":"f289870637a54985d05da259e81220ed64207b131479bde0911719f361e6671c"} Dec 09 05:00:52 crc kubenswrapper[4766]: I1209 05:00:52.340380 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54675c5b47-vxxgx" Dec 09 05:00:52 crc kubenswrapper[4766]: I1209 05:00:52.340913 4766 scope.go:117] "RemoveContainer" containerID="924aeaf6b7236eca9506a0f744d6da380a8e1109189c8b8ca831d50b995abe75" Dec 09 05:00:52 crc kubenswrapper[4766]: I1209 05:00:52.386073 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54675c5b47-vxxgx"] Dec 09 05:00:52 crc kubenswrapper[4766]: I1209 05:00:52.391312 4766 scope.go:117] "RemoveContainer" containerID="14467922eec6ab175fa2754214590c8647e9604b582181348b4cc1b6372ee447" Dec 09 05:00:52 crc kubenswrapper[4766]: I1209 05:00:52.407569 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54675c5b47-vxxgx"] Dec 09 05:00:52 crc kubenswrapper[4766]: I1209 05:00:52.854396 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a96a068-ae40-41b7-b734-8bbf7d33126a" path="/var/lib/kubelet/pods/8a96a068-ae40-41b7-b734-8bbf7d33126a/volumes" Dec 09 05:00:59 crc kubenswrapper[4766]: 
I1209 05:00:59.391147 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.149568 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29420941-cct7p"] Dec 09 05:01:00 crc kubenswrapper[4766]: E1209 05:01:00.150667 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a96a068-ae40-41b7-b734-8bbf7d33126a" containerName="dnsmasq-dns" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.150699 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a96a068-ae40-41b7-b734-8bbf7d33126a" containerName="dnsmasq-dns" Dec 09 05:01:00 crc kubenswrapper[4766]: E1209 05:01:00.150783 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a96a068-ae40-41b7-b734-8bbf7d33126a" containerName="init" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.150797 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a96a068-ae40-41b7-b734-8bbf7d33126a" containerName="init" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.151150 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a96a068-ae40-41b7-b734-8bbf7d33126a" containerName="dnsmasq-dns" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.152181 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.199042 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29420941-cct7p"] Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.201086 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-combined-ca-bundle\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.201305 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v252\" (UniqueName: \"kubernetes.io/projected/612474e2-a120-42c9-b73d-b3bef7c80036-kube-api-access-7v252\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.201349 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-config-data\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.201378 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-fernet-keys\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.303797 4766 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7v252\" (UniqueName: \"kubernetes.io/projected/612474e2-a120-42c9-b73d-b3bef7c80036-kube-api-access-7v252\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.304067 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-config-data\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.304103 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-fernet-keys\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.304137 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-combined-ca-bundle\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.311424 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-config-data\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.311548 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-combined-ca-bundle\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.317743 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-fernet-keys\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.318331 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v252\" (UniqueName: \"kubernetes.io/projected/612474e2-a120-42c9-b73d-b3bef7c80036-kube-api-access-7v252\") pod \"keystone-cron-29420941-cct7p\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.445571 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" event={"ID":"90e77954-e1ed-4e5d-97c1-bdd8e94841b7","Type":"ContainerStarted","Data":"753764f2c113387dacfe32f6a1c9af653f253a0e4c8c46ecc537468355737c39"} Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.470989 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" podStartSLOduration=2.6121652810000002 podStartE2EDuration="15.470973381s" podCreationTimestamp="2025-12-09 05:00:45 +0000 UTC" firstStartedPulling="2025-12-09 05:00:46.528437705 +0000 UTC m=+6528.237743151" lastFinishedPulling="2025-12-09 05:00:59.387245785 +0000 UTC m=+6541.096551251" observedRunningTime="2025-12-09 05:01:00.465448452 +0000 UTC m=+6542.174753878" watchObservedRunningTime="2025-12-09 05:01:00.470973381 +0000 UTC m=+6542.180278807" 
Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.480549 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:00 crc kubenswrapper[4766]: I1209 05:01:00.970313 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29420941-cct7p"] Dec 09 05:01:00 crc kubenswrapper[4766]: W1209 05:01:00.978473 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod612474e2_a120_42c9_b73d_b3bef7c80036.slice/crio-8ac4b53a1f214fba349da3fb1f59b8b7733e4ee1b780403a143e516380fe69a4 WatchSource:0}: Error finding container 8ac4b53a1f214fba349da3fb1f59b8b7733e4ee1b780403a143e516380fe69a4: Status 404 returned error can't find the container with id 8ac4b53a1f214fba349da3fb1f59b8b7733e4ee1b780403a143e516380fe69a4 Dec 09 05:01:01 crc kubenswrapper[4766]: I1209 05:01:01.456340 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29420941-cct7p" event={"ID":"612474e2-a120-42c9-b73d-b3bef7c80036","Type":"ContainerStarted","Data":"2eaa426f8525edeaa49168acfccb90ffb5262c1fb8e93141d64f13bcb4f19114"} Dec 09 05:01:01 crc kubenswrapper[4766]: I1209 05:01:01.456675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29420941-cct7p" event={"ID":"612474e2-a120-42c9-b73d-b3bef7c80036","Type":"ContainerStarted","Data":"8ac4b53a1f214fba349da3fb1f59b8b7733e4ee1b780403a143e516380fe69a4"} Dec 09 05:01:01 crc kubenswrapper[4766]: I1209 05:01:01.475270 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29420941-cct7p" podStartSLOduration=1.475252249 podStartE2EDuration="1.475252249s" podCreationTimestamp="2025-12-09 05:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:01:01.470475479 +0000 UTC 
m=+6543.179780915" watchObservedRunningTime="2025-12-09 05:01:01.475252249 +0000 UTC m=+6543.184557675" Dec 09 05:01:01 crc kubenswrapper[4766]: I1209 05:01:01.839732 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:01:01 crc kubenswrapper[4766]: E1209 05:01:01.840288 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:01:04 crc kubenswrapper[4766]: I1209 05:01:04.503527 4766 generic.go:334] "Generic (PLEG): container finished" podID="612474e2-a120-42c9-b73d-b3bef7c80036" containerID="2eaa426f8525edeaa49168acfccb90ffb5262c1fb8e93141d64f13bcb4f19114" exitCode=0 Dec 09 05:01:04 crc kubenswrapper[4766]: I1209 05:01:04.503631 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29420941-cct7p" event={"ID":"612474e2-a120-42c9-b73d-b3bef7c80036","Type":"ContainerDied","Data":"2eaa426f8525edeaa49168acfccb90ffb5262c1fb8e93141d64f13bcb4f19114"} Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.000354 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.173166 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v252\" (UniqueName: \"kubernetes.io/projected/612474e2-a120-42c9-b73d-b3bef7c80036-kube-api-access-7v252\") pod \"612474e2-a120-42c9-b73d-b3bef7c80036\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.173491 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-config-data\") pod \"612474e2-a120-42c9-b73d-b3bef7c80036\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.173552 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-combined-ca-bundle\") pod \"612474e2-a120-42c9-b73d-b3bef7c80036\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.173576 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-fernet-keys\") pod \"612474e2-a120-42c9-b73d-b3bef7c80036\" (UID: \"612474e2-a120-42c9-b73d-b3bef7c80036\") " Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.178341 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "612474e2-a120-42c9-b73d-b3bef7c80036" (UID: "612474e2-a120-42c9-b73d-b3bef7c80036"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.185554 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/612474e2-a120-42c9-b73d-b3bef7c80036-kube-api-access-7v252" (OuterVolumeSpecName: "kube-api-access-7v252") pod "612474e2-a120-42c9-b73d-b3bef7c80036" (UID: "612474e2-a120-42c9-b73d-b3bef7c80036"). InnerVolumeSpecName "kube-api-access-7v252". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.208848 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "612474e2-a120-42c9-b73d-b3bef7c80036" (UID: "612474e2-a120-42c9-b73d-b3bef7c80036"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.243506 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-config-data" (OuterVolumeSpecName: "config-data") pod "612474e2-a120-42c9-b73d-b3bef7c80036" (UID: "612474e2-a120-42c9-b73d-b3bef7c80036"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.276182 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.276246 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.276260 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v252\" (UniqueName: \"kubernetes.io/projected/612474e2-a120-42c9-b73d-b3bef7c80036-kube-api-access-7v252\") on node \"crc\" DevicePath \"\"" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.276273 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612474e2-a120-42c9-b73d-b3bef7c80036-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.536005 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29420941-cct7p" event={"ID":"612474e2-a120-42c9-b73d-b3bef7c80036","Type":"ContainerDied","Data":"8ac4b53a1f214fba349da3fb1f59b8b7733e4ee1b780403a143e516380fe69a4"} Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.536061 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ac4b53a1f214fba349da3fb1f59b8b7733e4ee1b780403a143e516380fe69a4" Dec 09 05:01:06 crc kubenswrapper[4766]: I1209 05:01:06.536104 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29420941-cct7p" Dec 09 05:01:14 crc kubenswrapper[4766]: I1209 05:01:14.642458 4766 generic.go:334] "Generic (PLEG): container finished" podID="90e77954-e1ed-4e5d-97c1-bdd8e94841b7" containerID="753764f2c113387dacfe32f6a1c9af653f253a0e4c8c46ecc537468355737c39" exitCode=0 Dec 09 05:01:14 crc kubenswrapper[4766]: I1209 05:01:14.642714 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" event={"ID":"90e77954-e1ed-4e5d-97c1-bdd8e94841b7","Type":"ContainerDied","Data":"753764f2c113387dacfe32f6a1c9af653f253a0e4c8c46ecc537468355737c39"} Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.213741 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.323520 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-inventory\") pod \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.323565 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ceph\") pod \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.323613 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-pre-adoption-validation-combined-ca-bundle\") pod \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " Dec 09 05:01:16 crc 
kubenswrapper[4766]: I1209 05:01:16.323631 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tjjc\" (UniqueName: \"kubernetes.io/projected/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-kube-api-access-2tjjc\") pod \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.323706 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ssh-key\") pod \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\" (UID: \"90e77954-e1ed-4e5d-97c1-bdd8e94841b7\") " Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.329566 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-kube-api-access-2tjjc" (OuterVolumeSpecName: "kube-api-access-2tjjc") pod "90e77954-e1ed-4e5d-97c1-bdd8e94841b7" (UID: "90e77954-e1ed-4e5d-97c1-bdd8e94841b7"). InnerVolumeSpecName "kube-api-access-2tjjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.330142 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "90e77954-e1ed-4e5d-97c1-bdd8e94841b7" (UID: "90e77954-e1ed-4e5d-97c1-bdd8e94841b7"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.337522 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ceph" (OuterVolumeSpecName: "ceph") pod "90e77954-e1ed-4e5d-97c1-bdd8e94841b7" (UID: "90e77954-e1ed-4e5d-97c1-bdd8e94841b7"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.362582 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-inventory" (OuterVolumeSpecName: "inventory") pod "90e77954-e1ed-4e5d-97c1-bdd8e94841b7" (UID: "90e77954-e1ed-4e5d-97c1-bdd8e94841b7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.374998 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "90e77954-e1ed-4e5d-97c1-bdd8e94841b7" (UID: "90e77954-e1ed-4e5d-97c1-bdd8e94841b7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.434630 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.434677 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.434691 4766 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.434705 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tjjc\" (UniqueName: \"kubernetes.io/projected/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-kube-api-access-2tjjc\") on node 
\"crc\" DevicePath \"\"" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.434717 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90e77954-e1ed-4e5d-97c1-bdd8e94841b7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.665978 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" event={"ID":"90e77954-e1ed-4e5d-97c1-bdd8e94841b7","Type":"ContainerDied","Data":"5e802ed54b73c94caa3c17951aad8ad1f83c2ea0cc63eaa18ee14dabb400aa39"} Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.666384 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e802ed54b73c94caa3c17951aad8ad1f83c2ea0cc63eaa18ee14dabb400aa39" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.666078 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cs886b" Dec 09 05:01:16 crc kubenswrapper[4766]: I1209 05:01:16.839956 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:01:16 crc kubenswrapper[4766]: E1209 05:01:16.840522 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.452857 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q"] Dec 09 05:01:23 crc kubenswrapper[4766]: E1209 05:01:23.454350 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e77954-e1ed-4e5d-97c1-bdd8e94841b7" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.454390 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e77954-e1ed-4e5d-97c1-bdd8e94841b7" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 09 05:01:23 crc kubenswrapper[4766]: E1209 05:01:23.454442 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612474e2-a120-42c9-b73d-b3bef7c80036" containerName="keystone-cron" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.454458 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="612474e2-a120-42c9-b73d-b3bef7c80036" containerName="keystone-cron" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.454900 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="612474e2-a120-42c9-b73d-b3bef7c80036" containerName="keystone-cron" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.454955 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e77954-e1ed-4e5d-97c1-bdd8e94841b7" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.456305 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.460484 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.460944 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.461397 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.461972 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.470244 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q"] Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.491625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.491694 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dwrj\" (UniqueName: \"kubernetes.io/projected/b536acce-84ce-48e5-b47e-9c68a26c82a7-kube-api-access-5dwrj\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 
05:01:23.491789 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.491914 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.491952 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.593659 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.593703 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dwrj\" (UniqueName: 
\"kubernetes.io/projected/b536acce-84ce-48e5-b47e-9c68a26c82a7-kube-api-access-5dwrj\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.593755 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.593815 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.593838 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.600160 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 
05:01:23.600585 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.601380 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.607552 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.615791 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dwrj\" (UniqueName: \"kubernetes.io/projected/b536acce-84ce-48e5-b47e-9c68a26c82a7-kube-api-access-5dwrj\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:23 crc kubenswrapper[4766]: I1209 05:01:23.789897 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:01:24 crc kubenswrapper[4766]: I1209 05:01:24.442355 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q"] Dec 09 05:01:24 crc kubenswrapper[4766]: W1209 05:01:24.447322 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb536acce_84ce_48e5_b47e_9c68a26c82a7.slice/crio-596a49c07e6ff263549535c22d9893e609e8781e188c3a9057abdf4c99bbd125 WatchSource:0}: Error finding container 596a49c07e6ff263549535c22d9893e609e8781e188c3a9057abdf4c99bbd125: Status 404 returned error can't find the container with id 596a49c07e6ff263549535c22d9893e609e8781e188c3a9057abdf4c99bbd125 Dec 09 05:01:24 crc kubenswrapper[4766]: I1209 05:01:24.747613 4766 scope.go:117] "RemoveContainer" containerID="d193497fad6ecc979ed3ecb60088f51cc34576887bd67816a886011691eebf10" Dec 09 05:01:24 crc kubenswrapper[4766]: I1209 05:01:24.753572 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" event={"ID":"b536acce-84ce-48e5-b47e-9c68a26c82a7","Type":"ContainerStarted","Data":"596a49c07e6ff263549535c22d9893e609e8781e188c3a9057abdf4c99bbd125"} Dec 09 05:01:25 crc kubenswrapper[4766]: I1209 05:01:25.081173 4766 scope.go:117] "RemoveContainer" containerID="20007cb3bd525f6fb6b4c1b898c82340c508c9f7df88df0cd2e5c6192292d036" Dec 09 05:01:25 crc kubenswrapper[4766]: I1209 05:01:25.762074 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" event={"ID":"b536acce-84ce-48e5-b47e-9c68a26c82a7","Type":"ContainerStarted","Data":"0444f6f4c5d254330997098faf1247a4b82994a27409efa2213b1f0c15380315"} Dec 09 05:01:25 crc kubenswrapper[4766]: I1209 05:01:25.779236 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" podStartSLOduration=2.572144481 podStartE2EDuration="2.77919007s" podCreationTimestamp="2025-12-09 05:01:23 +0000 UTC" firstStartedPulling="2025-12-09 05:01:24.461121088 +0000 UTC m=+6566.170426524" lastFinishedPulling="2025-12-09 05:01:24.668166687 +0000 UTC m=+6566.377472113" observedRunningTime="2025-12-09 05:01:25.778331037 +0000 UTC m=+6567.487636483" watchObservedRunningTime="2025-12-09 05:01:25.77919007 +0000 UTC m=+6567.488495506" Dec 09 05:01:30 crc kubenswrapper[4766]: I1209 05:01:30.840286 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:01:30 crc kubenswrapper[4766]: E1209 05:01:30.841549 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:01:45 crc kubenswrapper[4766]: I1209 05:01:45.839202 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:01:45 crc kubenswrapper[4766]: E1209 05:01:45.839954 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:02:00 crc kubenswrapper[4766]: I1209 05:02:00.840095 4766 scope.go:117] "RemoveContainer" 
containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:02:00 crc kubenswrapper[4766]: E1209 05:02:00.841463 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:02:14 crc kubenswrapper[4766]: I1209 05:02:14.840141 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:02:14 crc kubenswrapper[4766]: E1209 05:02:14.840984 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:02:25 crc kubenswrapper[4766]: I1209 05:02:25.839508 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:02:25 crc kubenswrapper[4766]: E1209 05:02:25.840500 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:02:32 crc kubenswrapper[4766]: I1209 05:02:32.053190 4766 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-2ckkq"] Dec 09 05:02:32 crc kubenswrapper[4766]: I1209 05:02:32.066979 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-2ckkq"] Dec 09 05:02:32 crc kubenswrapper[4766]: I1209 05:02:32.862758 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="155b15f7-da34-4348-a75d-07da640ab348" path="/var/lib/kubelet/pods/155b15f7-da34-4348-a75d-07da640ab348/volumes" Dec 09 05:02:34 crc kubenswrapper[4766]: I1209 05:02:34.029063 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-ee4a-account-create-update-j9clj"] Dec 09 05:02:34 crc kubenswrapper[4766]: I1209 05:02:34.042761 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-ee4a-account-create-update-j9clj"] Dec 09 05:02:34 crc kubenswrapper[4766]: I1209 05:02:34.855564 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d94708-9c8a-42d8-aa7c-4698518e046c" path="/var/lib/kubelet/pods/28d94708-9c8a-42d8-aa7c-4698518e046c/volumes" Dec 09 05:02:36 crc kubenswrapper[4766]: I1209 05:02:36.842818 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:02:36 crc kubenswrapper[4766]: E1209 05:02:36.843764 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:02:39 crc kubenswrapper[4766]: I1209 05:02:39.037105 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-grfpq"] Dec 09 05:02:39 crc kubenswrapper[4766]: I1209 
05:02:39.049232 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-grfpq"] Dec 09 05:02:40 crc kubenswrapper[4766]: I1209 05:02:40.058449 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-bca3-account-create-update-rj692"] Dec 09 05:02:40 crc kubenswrapper[4766]: I1209 05:02:40.076311 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-bca3-account-create-update-rj692"] Dec 09 05:02:40 crc kubenswrapper[4766]: I1209 05:02:40.862616 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43010a7a-c276-4355-9ed5-4280ca859180" path="/var/lib/kubelet/pods/43010a7a-c276-4355-9ed5-4280ca859180/volumes" Dec 09 05:02:40 crc kubenswrapper[4766]: I1209 05:02:40.864954 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5621b85-b6d5-48c6-911e-a49541e1cdf5" path="/var/lib/kubelet/pods/c5621b85-b6d5-48c6-911e-a49541e1cdf5/volumes" Dec 09 05:02:49 crc kubenswrapper[4766]: I1209 05:02:49.839766 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:02:50 crc kubenswrapper[4766]: I1209 05:02:50.909197 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"d021bcfbc8a8df6c3805341afe498144b1ef33e23b7238e24e497487bc503d42"} Dec 09 05:03:09 crc kubenswrapper[4766]: I1209 05:03:09.042708 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-bkxsh"] Dec 09 05:03:09 crc kubenswrapper[4766]: I1209 05:03:09.051036 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-bkxsh"] Dec 09 05:03:10 crc kubenswrapper[4766]: I1209 05:03:10.866009 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="019c97bc-64bc-436e-b815-72462b2df991" 
path="/var/lib/kubelet/pods/019c97bc-64bc-436e-b815-72462b2df991/volumes" Dec 09 05:03:25 crc kubenswrapper[4766]: I1209 05:03:25.185420 4766 scope.go:117] "RemoveContainer" containerID="8dec284693e16bf24947ddd894429d7d4594dd7dcb76e47b160d5ab5fab39c46" Dec 09 05:03:25 crc kubenswrapper[4766]: I1209 05:03:25.241905 4766 scope.go:117] "RemoveContainer" containerID="bed691b430cf9b58e5f3ee65bb04ec9d4318a833b6ed67595404c0aa4283a3e7" Dec 09 05:03:25 crc kubenswrapper[4766]: I1209 05:03:25.282323 4766 scope.go:117] "RemoveContainer" containerID="37c36d6df7256898cd91261a5831c4e8d8aca76b323fbf27268333a26d339515" Dec 09 05:03:25 crc kubenswrapper[4766]: I1209 05:03:25.327833 4766 scope.go:117] "RemoveContainer" containerID="2414309a61531152f145b49721ab1afd5ad97f46d0d1e0f1cfbe9bfa8271a488" Dec 09 05:03:25 crc kubenswrapper[4766]: I1209 05:03:25.377851 4766 scope.go:117] "RemoveContainer" containerID="ffac8514a004196856a6fe0bd742c3c5106d8aeb565454cd48d7702de07fbf57" Dec 09 05:03:25 crc kubenswrapper[4766]: I1209 05:03:25.428934 4766 scope.go:117] "RemoveContainer" containerID="1e11d5613abc857449f471091ec107b112e673f92506fe7fa0ab1c374b298be6" Dec 09 05:05:07 crc kubenswrapper[4766]: I1209 05:05:07.316543 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:05:07 crc kubenswrapper[4766]: I1209 05:05:07.317084 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:05:37 crc kubenswrapper[4766]: I1209 05:05:37.316879 4766 patch_prober.go:28] interesting 
pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:05:37 crc kubenswrapper[4766]: I1209 05:05:37.317387 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.676622 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l6ljn"] Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.679698 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.708119 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6ljn"] Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.794539 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-utilities\") pod \"certified-operators-l6ljn\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.794610 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-catalog-content\") pod \"certified-operators-l6ljn\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " 
pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.794790 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcq62\" (UniqueName: \"kubernetes.io/projected/2869bea5-d149-40de-9c1e-ecefe14d7d7b-kube-api-access-pcq62\") pod \"certified-operators-l6ljn\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.896292 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcq62\" (UniqueName: \"kubernetes.io/projected/2869bea5-d149-40de-9c1e-ecefe14d7d7b-kube-api-access-pcq62\") pod \"certified-operators-l6ljn\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.896398 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-utilities\") pod \"certified-operators-l6ljn\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.896419 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-catalog-content\") pod \"certified-operators-l6ljn\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.896938 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-catalog-content\") pod \"certified-operators-l6ljn\" (UID: 
\"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.897412 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-utilities\") pod \"certified-operators-l6ljn\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:45 crc kubenswrapper[4766]: I1209 05:05:45.931992 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcq62\" (UniqueName: \"kubernetes.io/projected/2869bea5-d149-40de-9c1e-ecefe14d7d7b-kube-api-access-pcq62\") pod \"certified-operators-l6ljn\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:46 crc kubenswrapper[4766]: I1209 05:05:46.010496 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:46 crc kubenswrapper[4766]: I1209 05:05:46.557463 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6ljn"] Dec 09 05:05:47 crc kubenswrapper[4766]: I1209 05:05:47.106695 4766 generic.go:334] "Generic (PLEG): container finished" podID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerID="98edf4c7872c55ef8b163eaedaf5c67ddc97ca8ff69111618bbf0bdb8e5e36fc" exitCode=0 Dec 09 05:05:47 crc kubenswrapper[4766]: I1209 05:05:47.106779 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6ljn" event={"ID":"2869bea5-d149-40de-9c1e-ecefe14d7d7b","Type":"ContainerDied","Data":"98edf4c7872c55ef8b163eaedaf5c67ddc97ca8ff69111618bbf0bdb8e5e36fc"} Dec 09 05:05:47 crc kubenswrapper[4766]: I1209 05:05:47.107062 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6ljn" 
event={"ID":"2869bea5-d149-40de-9c1e-ecefe14d7d7b","Type":"ContainerStarted","Data":"5331d91f5afc8d9f7bdc3e07875a46f184ddf6ad6c0b3336c6c5d1938fb857b2"} Dec 09 05:05:47 crc kubenswrapper[4766]: I1209 05:05:47.108866 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.064838 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nts28"] Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.067242 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.086163 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nts28"] Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.123175 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6ljn" event={"ID":"2869bea5-d149-40de-9c1e-ecefe14d7d7b","Type":"ContainerStarted","Data":"a557dfdca0cb4ae0fe94d90dd1dc16fbc6932cf8e66c44e50f2b3fd103e4fc72"} Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.152904 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a543154-2164-4e7d-ac52-9ee7420ff8ba-catalog-content\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.152994 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a543154-2164-4e7d-ac52-9ee7420ff8ba-utilities\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc 
kubenswrapper[4766]: I1209 05:05:48.153049 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d4pz\" (UniqueName: \"kubernetes.io/projected/1a543154-2164-4e7d-ac52-9ee7420ff8ba-kube-api-access-2d4pz\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.255441 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a543154-2164-4e7d-ac52-9ee7420ff8ba-catalog-content\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.255511 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a543154-2164-4e7d-ac52-9ee7420ff8ba-utilities\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.255558 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d4pz\" (UniqueName: \"kubernetes.io/projected/1a543154-2164-4e7d-ac52-9ee7420ff8ba-kube-api-access-2d4pz\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.256067 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a543154-2164-4e7d-ac52-9ee7420ff8ba-catalog-content\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: 
I1209 05:05:48.256384 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a543154-2164-4e7d-ac52-9ee7420ff8ba-utilities\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.291133 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d4pz\" (UniqueName: \"kubernetes.io/projected/1a543154-2164-4e7d-ac52-9ee7420ff8ba-kube-api-access-2d4pz\") pod \"redhat-operators-nts28\" (UID: \"1a543154-2164-4e7d-ac52-9ee7420ff8ba\") " pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.401660 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:05:48 crc kubenswrapper[4766]: I1209 05:05:48.986628 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nts28"] Dec 09 05:05:49 crc kubenswrapper[4766]: I1209 05:05:49.142146 4766 generic.go:334] "Generic (PLEG): container finished" podID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerID="a557dfdca0cb4ae0fe94d90dd1dc16fbc6932cf8e66c44e50f2b3fd103e4fc72" exitCode=0 Dec 09 05:05:49 crc kubenswrapper[4766]: I1209 05:05:49.143250 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6ljn" event={"ID":"2869bea5-d149-40de-9c1e-ecefe14d7d7b","Type":"ContainerDied","Data":"a557dfdca0cb4ae0fe94d90dd1dc16fbc6932cf8e66c44e50f2b3fd103e4fc72"} Dec 09 05:05:49 crc kubenswrapper[4766]: I1209 05:05:49.144539 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nts28" event={"ID":"1a543154-2164-4e7d-ac52-9ee7420ff8ba","Type":"ContainerStarted","Data":"0daaf22d5fc0a5880d5438d462f79dbd0f81be5a37887c8243a6fc7c8428a3ff"} Dec 09 05:05:50 
crc kubenswrapper[4766]: I1209 05:05:50.156687 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6ljn" event={"ID":"2869bea5-d149-40de-9c1e-ecefe14d7d7b","Type":"ContainerStarted","Data":"1ec594fbf00567f0da489a26f2d312ea545825054ed6478b4be6184de82edc5c"} Dec 09 05:05:50 crc kubenswrapper[4766]: I1209 05:05:50.159064 4766 generic.go:334] "Generic (PLEG): container finished" podID="1a543154-2164-4e7d-ac52-9ee7420ff8ba" containerID="1567f629cb1dec278e59062b1a7e252edd91b16c54e85f93422900e5e12df830" exitCode=0 Dec 09 05:05:50 crc kubenswrapper[4766]: I1209 05:05:50.159110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nts28" event={"ID":"1a543154-2164-4e7d-ac52-9ee7420ff8ba","Type":"ContainerDied","Data":"1567f629cb1dec278e59062b1a7e252edd91b16c54e85f93422900e5e12df830"} Dec 09 05:05:50 crc kubenswrapper[4766]: I1209 05:05:50.189194 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l6ljn" podStartSLOduration=2.620036408 podStartE2EDuration="5.189175961s" podCreationTimestamp="2025-12-09 05:05:45 +0000 UTC" firstStartedPulling="2025-12-09 05:05:47.108471584 +0000 UTC m=+6828.817777040" lastFinishedPulling="2025-12-09 05:05:49.677611167 +0000 UTC m=+6831.386916593" observedRunningTime="2025-12-09 05:05:50.183126347 +0000 UTC m=+6831.892431783" watchObservedRunningTime="2025-12-09 05:05:50.189175961 +0000 UTC m=+6831.898481397" Dec 09 05:05:56 crc kubenswrapper[4766]: I1209 05:05:56.012155 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:56 crc kubenswrapper[4766]: I1209 05:05:56.013805 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:56 crc kubenswrapper[4766]: I1209 05:05:56.080432 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:56 crc kubenswrapper[4766]: I1209 05:05:56.293075 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:56 crc kubenswrapper[4766]: I1209 05:05:56.340866 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6ljn"] Dec 09 05:05:57 crc kubenswrapper[4766]: I1209 05:05:57.041920 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-2e57-account-create-update-74nhw"] Dec 09 05:05:57 crc kubenswrapper[4766]: I1209 05:05:57.051090 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-vn8fq"] Dec 09 05:05:57 crc kubenswrapper[4766]: I1209 05:05:57.058567 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-2e57-account-create-update-74nhw"] Dec 09 05:05:57 crc kubenswrapper[4766]: I1209 05:05:57.065544 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-vn8fq"] Dec 09 05:05:58 crc kubenswrapper[4766]: I1209 05:05:58.247775 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l6ljn" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerName="registry-server" containerID="cri-o://1ec594fbf00567f0da489a26f2d312ea545825054ed6478b4be6184de82edc5c" gracePeriod=2 Dec 09 05:05:58 crc kubenswrapper[4766]: I1209 05:05:58.878497 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57276a31-2e51-48c1-8740-f2e9fcec30f3" path="/var/lib/kubelet/pods/57276a31-2e51-48c1-8740-f2e9fcec30f3/volumes" Dec 09 05:05:58 crc kubenswrapper[4766]: I1209 05:05:58.883199 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12df1eb-da3e-4256-8889-4885f8e0286d" path="/var/lib/kubelet/pods/d12df1eb-da3e-4256-8889-4885f8e0286d/volumes" Dec 09 05:05:59 crc 
kubenswrapper[4766]: I1209 05:05:59.275484 4766 generic.go:334] "Generic (PLEG): container finished" podID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerID="1ec594fbf00567f0da489a26f2d312ea545825054ed6478b4be6184de82edc5c" exitCode=0 Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.275843 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6ljn" event={"ID":"2869bea5-d149-40de-9c1e-ecefe14d7d7b","Type":"ContainerDied","Data":"1ec594fbf00567f0da489a26f2d312ea545825054ed6478b4be6184de82edc5c"} Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.752022 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.781371 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcq62\" (UniqueName: \"kubernetes.io/projected/2869bea5-d149-40de-9c1e-ecefe14d7d7b-kube-api-access-pcq62\") pod \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.781861 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-catalog-content\") pod \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.781912 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-utilities\") pod \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\" (UID: \"2869bea5-d149-40de-9c1e-ecefe14d7d7b\") " Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.783283 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-utilities" (OuterVolumeSpecName: "utilities") pod "2869bea5-d149-40de-9c1e-ecefe14d7d7b" (UID: "2869bea5-d149-40de-9c1e-ecefe14d7d7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.791632 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2869bea5-d149-40de-9c1e-ecefe14d7d7b-kube-api-access-pcq62" (OuterVolumeSpecName: "kube-api-access-pcq62") pod "2869bea5-d149-40de-9c1e-ecefe14d7d7b" (UID: "2869bea5-d149-40de-9c1e-ecefe14d7d7b"). InnerVolumeSpecName "kube-api-access-pcq62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.842540 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2869bea5-d149-40de-9c1e-ecefe14d7d7b" (UID: "2869bea5-d149-40de-9c1e-ecefe14d7d7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.884492 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcq62\" (UniqueName: \"kubernetes.io/projected/2869bea5-d149-40de-9c1e-ecefe14d7d7b-kube-api-access-pcq62\") on node \"crc\" DevicePath \"\"" Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.884530 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:05:59 crc kubenswrapper[4766]: I1209 05:05:59.884539 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2869bea5-d149-40de-9c1e-ecefe14d7d7b-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.292077 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nts28" event={"ID":"1a543154-2164-4e7d-ac52-9ee7420ff8ba","Type":"ContainerStarted","Data":"a6944adf96792965724ce42394611496c40503c18647932b373a58de4d1726e3"} Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.298273 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6ljn" event={"ID":"2869bea5-d149-40de-9c1e-ecefe14d7d7b","Type":"ContainerDied","Data":"5331d91f5afc8d9f7bdc3e07875a46f184ddf6ad6c0b3336c6c5d1938fb857b2"} Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.298344 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6ljn" Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.298364 4766 scope.go:117] "RemoveContainer" containerID="1ec594fbf00567f0da489a26f2d312ea545825054ed6478b4be6184de82edc5c" Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.338161 4766 scope.go:117] "RemoveContainer" containerID="a557dfdca0cb4ae0fe94d90dd1dc16fbc6932cf8e66c44e50f2b3fd103e4fc72" Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.362553 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6ljn"] Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.371333 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l6ljn"] Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.384819 4766 scope.go:117] "RemoveContainer" containerID="98edf4c7872c55ef8b163eaedaf5c67ddc97ca8ff69111618bbf0bdb8e5e36fc" Dec 09 05:06:00 crc kubenswrapper[4766]: I1209 05:06:00.854921 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" path="/var/lib/kubelet/pods/2869bea5-d149-40de-9c1e-ecefe14d7d7b/volumes" Dec 09 05:06:02 crc kubenswrapper[4766]: I1209 05:06:02.327923 4766 generic.go:334] "Generic (PLEG): container finished" podID="1a543154-2164-4e7d-ac52-9ee7420ff8ba" containerID="a6944adf96792965724ce42394611496c40503c18647932b373a58de4d1726e3" exitCode=0 Dec 09 05:06:02 crc kubenswrapper[4766]: I1209 05:06:02.328002 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nts28" event={"ID":"1a543154-2164-4e7d-ac52-9ee7420ff8ba","Type":"ContainerDied","Data":"a6944adf96792965724ce42394611496c40503c18647932b373a58de4d1726e3"} Dec 09 05:06:04 crc kubenswrapper[4766]: I1209 05:06:04.357873 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nts28" 
event={"ID":"1a543154-2164-4e7d-ac52-9ee7420ff8ba","Type":"ContainerStarted","Data":"d0e7f83b6e819685e33f8e0a8fa078e447b5227c728771c9c917095dd2ddde3b"} Dec 09 05:06:04 crc kubenswrapper[4766]: I1209 05:06:04.392876 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nts28" podStartSLOduration=2.734026504 podStartE2EDuration="16.392858468s" podCreationTimestamp="2025-12-09 05:05:48 +0000 UTC" firstStartedPulling="2025-12-09 05:05:50.160832234 +0000 UTC m=+6831.870137660" lastFinishedPulling="2025-12-09 05:06:03.819664188 +0000 UTC m=+6845.528969624" observedRunningTime="2025-12-09 05:06:04.387597296 +0000 UTC m=+6846.096902722" watchObservedRunningTime="2025-12-09 05:06:04.392858468 +0000 UTC m=+6846.102163894" Dec 09 05:06:07 crc kubenswrapper[4766]: I1209 05:06:07.316614 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:06:07 crc kubenswrapper[4766]: I1209 05:06:07.317383 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:06:07 crc kubenswrapper[4766]: I1209 05:06:07.317441 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:06:07 crc kubenswrapper[4766]: I1209 05:06:07.318395 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d021bcfbc8a8df6c3805341afe498144b1ef33e23b7238e24e497487bc503d42"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:06:07 crc kubenswrapper[4766]: I1209 05:06:07.318453 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://d021bcfbc8a8df6c3805341afe498144b1ef33e23b7238e24e497487bc503d42" gracePeriod=600 Dec 09 05:06:08 crc kubenswrapper[4766]: I1209 05:06:08.402866 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:06:08 crc kubenswrapper[4766]: I1209 05:06:08.403628 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:06:08 crc kubenswrapper[4766]: I1209 05:06:08.408689 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="d021bcfbc8a8df6c3805341afe498144b1ef33e23b7238e24e497487bc503d42" exitCode=0 Dec 09 05:06:08 crc kubenswrapper[4766]: I1209 05:06:08.408721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"d021bcfbc8a8df6c3805341afe498144b1ef33e23b7238e24e497487bc503d42"} Dec 09 05:06:08 crc kubenswrapper[4766]: I1209 05:06:08.408769 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04"} Dec 09 05:06:08 crc 
kubenswrapper[4766]: I1209 05:06:08.408799 4766 scope.go:117] "RemoveContainer" containerID="95ec58b582d1a2cebc8e28e4bdbc7e02876d1d8055b0e3640fe2da7d132a589b" Dec 09 05:06:09 crc kubenswrapper[4766]: I1209 05:06:09.487792 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nts28" podUID="1a543154-2164-4e7d-ac52-9ee7420ff8ba" containerName="registry-server" probeResult="failure" output=< Dec 09 05:06:09 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:06:09 crc kubenswrapper[4766]: > Dec 09 05:06:12 crc kubenswrapper[4766]: I1209 05:06:12.033517 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-cjmsh"] Dec 09 05:06:12 crc kubenswrapper[4766]: I1209 05:06:12.043894 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-cjmsh"] Dec 09 05:06:12 crc kubenswrapper[4766]: I1209 05:06:12.855089 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d1d7bd-b62d-4f48-b72b-b9de5ecd611e" path="/var/lib/kubelet/pods/66d1d7bd-b62d-4f48-b72b-b9de5ecd611e/volumes" Dec 09 05:06:18 crc kubenswrapper[4766]: I1209 05:06:18.468267 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:06:18 crc kubenswrapper[4766]: I1209 05:06:18.535920 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nts28" Dec 09 05:06:18 crc kubenswrapper[4766]: I1209 05:06:18.617487 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nts28"] Dec 09 05:06:18 crc kubenswrapper[4766]: I1209 05:06:18.723123 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2gxpb"] Dec 09 05:06:18 crc kubenswrapper[4766]: I1209 05:06:18.723453 4766 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-2gxpb" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="registry-server" containerID="cri-o://3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144" gracePeriod=2 Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.247877 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.432750 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7pmd\" (UniqueName: \"kubernetes.io/projected/9fe686c1-516d-43be-8a61-776ff1f64cd1-kube-api-access-p7pmd\") pod \"9fe686c1-516d-43be-8a61-776ff1f64cd1\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.433085 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-utilities\") pod \"9fe686c1-516d-43be-8a61-776ff1f64cd1\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.433145 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-catalog-content\") pod \"9fe686c1-516d-43be-8a61-776ff1f64cd1\" (UID: \"9fe686c1-516d-43be-8a61-776ff1f64cd1\") " Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.438746 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-utilities" (OuterVolumeSpecName: "utilities") pod "9fe686c1-516d-43be-8a61-776ff1f64cd1" (UID: "9fe686c1-516d-43be-8a61-776ff1f64cd1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.450836 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fe686c1-516d-43be-8a61-776ff1f64cd1-kube-api-access-p7pmd" (OuterVolumeSpecName: "kube-api-access-p7pmd") pod "9fe686c1-516d-43be-8a61-776ff1f64cd1" (UID: "9fe686c1-516d-43be-8a61-776ff1f64cd1"). InnerVolumeSpecName "kube-api-access-p7pmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.536029 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.536063 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7pmd\" (UniqueName: \"kubernetes.io/projected/9fe686c1-516d-43be-8a61-776ff1f64cd1-kube-api-access-p7pmd\") on node \"crc\" DevicePath \"\"" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.553403 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fe686c1-516d-43be-8a61-776ff1f64cd1" (UID: "9fe686c1-516d-43be-8a61-776ff1f64cd1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.560294 4766 generic.go:334] "Generic (PLEG): container finished" podID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerID="3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144" exitCode=0 Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.560424 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gxpb" event={"ID":"9fe686c1-516d-43be-8a61-776ff1f64cd1","Type":"ContainerDied","Data":"3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144"} Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.560495 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gxpb" event={"ID":"9fe686c1-516d-43be-8a61-776ff1f64cd1","Type":"ContainerDied","Data":"c4b4d013761a9f789468d9a048656640d1b995496ec2b4d81c328121d8fbc4b1"} Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.560530 4766 scope.go:117] "RemoveContainer" containerID="3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.560627 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gxpb" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.584399 4766 scope.go:117] "RemoveContainer" containerID="46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.605436 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2gxpb"] Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.613754 4766 scope.go:117] "RemoveContainer" containerID="f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.616947 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2gxpb"] Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.637869 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe686c1-516d-43be-8a61-776ff1f64cd1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.651310 4766 scope.go:117] "RemoveContainer" containerID="3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144" Dec 09 05:06:19 crc kubenswrapper[4766]: E1209 05:06:19.651831 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144\": container with ID starting with 3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144 not found: ID does not exist" containerID="3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.651867 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144"} err="failed to get container status 
\"3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144\": rpc error: code = NotFound desc = could not find container \"3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144\": container with ID starting with 3783accaf29a314102e3b13dfa008425bf0ecd09b01cc1b1dada16dae476a144 not found: ID does not exist" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.651887 4766 scope.go:117] "RemoveContainer" containerID="46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74" Dec 09 05:06:19 crc kubenswrapper[4766]: E1209 05:06:19.652330 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74\": container with ID starting with 46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74 not found: ID does not exist" containerID="46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.652392 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74"} err="failed to get container status \"46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74\": rpc error: code = NotFound desc = could not find container \"46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74\": container with ID starting with 46013ab829c3e479e2b1c0b9df4b5befffb5e6cb946d9bd1219afdd822cccd74 not found: ID does not exist" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.652435 4766 scope.go:117] "RemoveContainer" containerID="f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6" Dec 09 05:06:19 crc kubenswrapper[4766]: E1209 05:06:19.653322 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6\": container with ID starting with f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6 not found: ID does not exist" containerID="f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6" Dec 09 05:06:19 crc kubenswrapper[4766]: I1209 05:06:19.653357 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6"} err="failed to get container status \"f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6\": rpc error: code = NotFound desc = could not find container \"f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6\": container with ID starting with f3a054abc262071535c5321a8401256b910dc70e9051427ede37805fcbfc5db6 not found: ID does not exist" Dec 09 05:06:20 crc kubenswrapper[4766]: I1209 05:06:20.864570 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" path="/var/lib/kubelet/pods/9fe686c1-516d-43be-8a61-776ff1f64cd1/volumes" Dec 09 05:06:25 crc kubenswrapper[4766]: I1209 05:06:25.639729 4766 scope.go:117] "RemoveContainer" containerID="89dccbeaf58b046de7e35de63836973239179f8354f69e6dbf2ec1959dcf6029" Dec 09 05:06:25 crc kubenswrapper[4766]: I1209 05:06:25.667630 4766 scope.go:117] "RemoveContainer" containerID="cb536fafaeaf8819edd8ffef7694e5f18ac4237ca4383145032004d7ff8a01e2" Dec 09 05:06:25 crc kubenswrapper[4766]: I1209 05:06:25.726906 4766 scope.go:117] "RemoveContainer" containerID="43efc8fbaea50861dce629607ecc27ca03fba73e8f625129ff4550aae0861893" Dec 09 05:08:07 crc kubenswrapper[4766]: I1209 05:08:07.316187 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 09 05:08:07 crc kubenswrapper[4766]: I1209 05:08:07.316820 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:08:31 crc kubenswrapper[4766]: I1209 05:08:31.061849 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-ec05-account-create-update-ls9kv"] Dec 09 05:08:31 crc kubenswrapper[4766]: I1209 05:08:31.075277 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-jklmr"] Dec 09 05:08:31 crc kubenswrapper[4766]: I1209 05:08:31.089035 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-ec05-account-create-update-ls9kv"] Dec 09 05:08:31 crc kubenswrapper[4766]: I1209 05:08:31.102335 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-jklmr"] Dec 09 05:08:32 crc kubenswrapper[4766]: I1209 05:08:32.854469 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0b5b8f-1a7e-4281-be26-72dd9e91573e" path="/var/lib/kubelet/pods/5b0b5b8f-1a7e-4281-be26-72dd9e91573e/volumes" Dec 09 05:08:32 crc kubenswrapper[4766]: I1209 05:08:32.856327 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0881fad-40f0-492b-ac13-742c63e7f260" path="/var/lib/kubelet/pods/b0881fad-40f0-492b-ac13-742c63e7f260/volumes" Dec 09 05:08:37 crc kubenswrapper[4766]: I1209 05:08:37.317163 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:08:37 crc kubenswrapper[4766]: I1209 05:08:37.317834 4766 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.716560 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5b9sr"] Dec 09 05:08:39 crc kubenswrapper[4766]: E1209 05:08:39.717616 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="extract-utilities" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.717633 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="extract-utilities" Dec 09 05:08:39 crc kubenswrapper[4766]: E1209 05:08:39.717670 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerName="extract-content" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.717678 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerName="extract-content" Dec 09 05:08:39 crc kubenswrapper[4766]: E1209 05:08:39.717703 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="registry-server" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.717710 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="registry-server" Dec 09 05:08:39 crc kubenswrapper[4766]: E1209 05:08:39.717721 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerName="registry-server" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.717728 4766 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerName="registry-server" Dec 09 05:08:39 crc kubenswrapper[4766]: E1209 05:08:39.717742 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerName="extract-utilities" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.717751 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerName="extract-utilities" Dec 09 05:08:39 crc kubenswrapper[4766]: E1209 05:08:39.717766 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="extract-content" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.717772 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="extract-content" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.718025 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fe686c1-516d-43be-8a61-776ff1f64cd1" containerName="registry-server" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.718056 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2869bea5-d149-40de-9c1e-ecefe14d7d7b" containerName="registry-server" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.719929 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.733388 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5b9sr"] Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.878288 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-catalog-content\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.878464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmwdb\" (UniqueName: \"kubernetes.io/projected/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-kube-api-access-hmwdb\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.878549 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-utilities\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.981105 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-utilities\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.981323 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-catalog-content\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.981506 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmwdb\" (UniqueName: \"kubernetes.io/projected/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-kube-api-access-hmwdb\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.981921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-catalog-content\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:39 crc kubenswrapper[4766]: I1209 05:08:39.982272 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-utilities\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:40 crc kubenswrapper[4766]: I1209 05:08:40.004656 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmwdb\" (UniqueName: \"kubernetes.io/projected/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-kube-api-access-hmwdb\") pod \"community-operators-5b9sr\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:40 crc kubenswrapper[4766]: I1209 05:08:40.059398 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:40 crc kubenswrapper[4766]: I1209 05:08:40.556489 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5b9sr"] Dec 09 05:08:41 crc kubenswrapper[4766]: I1209 05:08:41.237688 4766 generic.go:334] "Generic (PLEG): container finished" podID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerID="842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9" exitCode=0 Dec 09 05:08:41 crc kubenswrapper[4766]: I1209 05:08:41.237841 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5b9sr" event={"ID":"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f","Type":"ContainerDied","Data":"842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9"} Dec 09 05:08:41 crc kubenswrapper[4766]: I1209 05:08:41.238169 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5b9sr" event={"ID":"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f","Type":"ContainerStarted","Data":"0198d889642917d087656aa4f1dc934266c23136d42e9f1aed2c95a71a8b3105"} Dec 09 05:08:43 crc kubenswrapper[4766]: I1209 05:08:43.051668 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-xczbt"] Dec 09 05:08:43 crc kubenswrapper[4766]: I1209 05:08:43.064932 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-xczbt"] Dec 09 05:08:43 crc kubenswrapper[4766]: I1209 05:08:43.265134 4766 generic.go:334] "Generic (PLEG): container finished" podID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerID="3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41" exitCode=0 Dec 09 05:08:43 crc kubenswrapper[4766]: I1209 05:08:43.265200 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5b9sr" 
event={"ID":"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f","Type":"ContainerDied","Data":"3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41"} Dec 09 05:08:44 crc kubenswrapper[4766]: I1209 05:08:44.275793 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5b9sr" event={"ID":"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f","Type":"ContainerStarted","Data":"35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8"} Dec 09 05:08:44 crc kubenswrapper[4766]: I1209 05:08:44.305728 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5b9sr" podStartSLOduration=2.850721957 podStartE2EDuration="5.305709362s" podCreationTimestamp="2025-12-09 05:08:39 +0000 UTC" firstStartedPulling="2025-12-09 05:08:41.240328232 +0000 UTC m=+7002.949633658" lastFinishedPulling="2025-12-09 05:08:43.695315597 +0000 UTC m=+7005.404621063" observedRunningTime="2025-12-09 05:08:44.301797247 +0000 UTC m=+7006.011102703" watchObservedRunningTime="2025-12-09 05:08:44.305709362 +0000 UTC m=+7006.015014798" Dec 09 05:08:44 crc kubenswrapper[4766]: I1209 05:08:44.853757 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef90510-1996-4325-9bf1-2b68a0c95146" path="/var/lib/kubelet/pods/7ef90510-1996-4325-9bf1-2b68a0c95146/volumes" Dec 09 05:08:50 crc kubenswrapper[4766]: I1209 05:08:50.060574 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:50 crc kubenswrapper[4766]: I1209 05:08:50.061202 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:50 crc kubenswrapper[4766]: I1209 05:08:50.150280 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:50 crc kubenswrapper[4766]: I1209 05:08:50.445086 4766 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:50 crc kubenswrapper[4766]: I1209 05:08:50.509173 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5b9sr"] Dec 09 05:08:52 crc kubenswrapper[4766]: I1209 05:08:52.381104 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5b9sr" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerName="registry-server" containerID="cri-o://35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8" gracePeriod=2 Dec 09 05:08:52 crc kubenswrapper[4766]: I1209 05:08:52.859338 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:52 crc kubenswrapper[4766]: I1209 05:08:52.895527 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-utilities\") pod \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " Dec 09 05:08:52 crc kubenswrapper[4766]: I1209 05:08:52.896251 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-utilities" (OuterVolumeSpecName: "utilities") pod "e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" (UID: "e274cac9-c7b5-45eb-be2c-5e6aecd7b69f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:08:52 crc kubenswrapper[4766]: I1209 05:08:52.896515 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmwdb\" (UniqueName: \"kubernetes.io/projected/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-kube-api-access-hmwdb\") pod \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " Dec 09 05:08:52 crc kubenswrapper[4766]: I1209 05:08:52.897938 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:08:52 crc kubenswrapper[4766]: I1209 05:08:52.902354 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-kube-api-access-hmwdb" (OuterVolumeSpecName: "kube-api-access-hmwdb") pod "e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" (UID: "e274cac9-c7b5-45eb-be2c-5e6aecd7b69f"). InnerVolumeSpecName "kube-api-access-hmwdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.000411 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-catalog-content\") pod \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\" (UID: \"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f\") " Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.001910 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmwdb\" (UniqueName: \"kubernetes.io/projected/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-kube-api-access-hmwdb\") on node \"crc\" DevicePath \"\"" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.400843 4766 generic.go:334] "Generic (PLEG): container finished" podID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerID="35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8" exitCode=0 Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.400909 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5b9sr" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.400980 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5b9sr" event={"ID":"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f","Type":"ContainerDied","Data":"35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8"} Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.401820 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5b9sr" event={"ID":"e274cac9-c7b5-45eb-be2c-5e6aecd7b69f","Type":"ContainerDied","Data":"0198d889642917d087656aa4f1dc934266c23136d42e9f1aed2c95a71a8b3105"} Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.401860 4766 scope.go:117] "RemoveContainer" containerID="35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.437862 4766 scope.go:117] "RemoveContainer" containerID="3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.475777 4766 scope.go:117] "RemoveContainer" containerID="842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.548355 4766 scope.go:117] "RemoveContainer" containerID="35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8" Dec 09 05:08:53 crc kubenswrapper[4766]: E1209 05:08:53.548805 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8\": container with ID starting with 35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8 not found: ID does not exist" containerID="35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.548858 4766 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8"} err="failed to get container status \"35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8\": rpc error: code = NotFound desc = could not find container \"35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8\": container with ID starting with 35b895c822eaf852497140c7e2daed25d6762494c190df3861adaa8c00159be8 not found: ID does not exist" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.548888 4766 scope.go:117] "RemoveContainer" containerID="3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41" Dec 09 05:08:53 crc kubenswrapper[4766]: E1209 05:08:53.549420 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41\": container with ID starting with 3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41 not found: ID does not exist" containerID="3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.549456 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41"} err="failed to get container status \"3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41\": rpc error: code = NotFound desc = could not find container \"3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41\": container with ID starting with 3e1cdfe27acc9f336dbac08b099b5c1c4a744babff56d2bddf784d8fdc6cad41 not found: ID does not exist" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.549474 4766 scope.go:117] "RemoveContainer" containerID="842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9" Dec 09 05:08:53 crc kubenswrapper[4766]: E1209 05:08:53.549859 4766 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9\": container with ID starting with 842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9 not found: ID does not exist" containerID="842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.549952 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9"} err="failed to get container status \"842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9\": rpc error: code = NotFound desc = could not find container \"842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9\": container with ID starting with 842df0239e7d813b79fa479a0ab521db6ff75b057cdb435cf7e402ac51cae1c9 not found: ID does not exist" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.857054 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" (UID: "e274cac9-c7b5-45eb-be2c-5e6aecd7b69f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:08:53 crc kubenswrapper[4766]: I1209 05:08:53.920870 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:08:54 crc kubenswrapper[4766]: I1209 05:08:54.048453 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5b9sr"] Dec 09 05:08:54 crc kubenswrapper[4766]: I1209 05:08:54.060419 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5b9sr"] Dec 09 05:08:54 crc kubenswrapper[4766]: I1209 05:08:54.852947 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" path="/var/lib/kubelet/pods/e274cac9-c7b5-45eb-be2c-5e6aecd7b69f/volumes" Dec 09 05:09:05 crc kubenswrapper[4766]: I1209 05:09:05.056459 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-r2zqs"] Dec 09 05:09:05 crc kubenswrapper[4766]: I1209 05:09:05.071096 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-r2zqs"] Dec 09 05:09:05 crc kubenswrapper[4766]: I1209 05:09:05.080351 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-58d2-account-create-update-hwmhp"] Dec 09 05:09:05 crc kubenswrapper[4766]: I1209 05:09:05.117653 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-58d2-account-create-update-hwmhp"] Dec 09 05:09:06 crc kubenswrapper[4766]: I1209 05:09:06.871117 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86805adc-8867-42c3-969a-9c23a0f59e17" path="/var/lib/kubelet/pods/86805adc-8867-42c3-969a-9c23a0f59e17/volumes" Dec 09 05:09:06 crc kubenswrapper[4766]: I1209 05:09:06.873928 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a0e7d30c-dd55-4878-bfd6-9916479661c4" path="/var/lib/kubelet/pods/a0e7d30c-dd55-4878-bfd6-9916479661c4/volumes" Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.316690 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.316742 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.316781 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.317542 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.317591 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" gracePeriod=600 Dec 09 05:09:07 crc kubenswrapper[4766]: E1209 05:09:07.444703 4766 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.600727 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" exitCode=0 Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.600771 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04"} Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.600805 4766 scope.go:117] "RemoveContainer" containerID="d021bcfbc8a8df6c3805341afe498144b1ef33e23b7238e24e497487bc503d42" Dec 09 05:09:07 crc kubenswrapper[4766]: I1209 05:09:07.601842 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:09:07 crc kubenswrapper[4766]: E1209 05:09:07.602444 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:09:15 crc kubenswrapper[4766]: I1209 05:09:15.038419 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-hz65c"] Dec 
09 05:09:15 crc kubenswrapper[4766]: I1209 05:09:15.050254 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-hz65c"] Dec 09 05:09:16 crc kubenswrapper[4766]: I1209 05:09:16.863158 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501e3fae-1369-4582-b9a5-af19e30ad540" path="/var/lib/kubelet/pods/501e3fae-1369-4582-b9a5-af19e30ad540/volumes" Dec 09 05:09:22 crc kubenswrapper[4766]: I1209 05:09:22.840426 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:09:22 crc kubenswrapper[4766]: E1209 05:09:22.841393 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:09:25 crc kubenswrapper[4766]: I1209 05:09:25.906888 4766 scope.go:117] "RemoveContainer" containerID="958910ec0b9e1c9d51a524072f28c4417b5ba1b2dc63a066bde08d8ea860bd94" Dec 09 05:09:25 crc kubenswrapper[4766]: I1209 05:09:25.953974 4766 scope.go:117] "RemoveContainer" containerID="de522f19ed5448ffb6f0ed59bbe038f1c300b2ed636208004edb4ca7c9fb83e0" Dec 09 05:09:26 crc kubenswrapper[4766]: I1209 05:09:26.024647 4766 scope.go:117] "RemoveContainer" containerID="e451e0a3fa8bfefe38cb73a790738200f8dd7a5de5b3e02e78d024d38a1a1b67" Dec 09 05:09:26 crc kubenswrapper[4766]: I1209 05:09:26.078300 4766 scope.go:117] "RemoveContainer" containerID="b281ecd66eeb9d569a8f2727632b756707525dd6b269a53cef9bcf46e62ec12b" Dec 09 05:09:26 crc kubenswrapper[4766]: I1209 05:09:26.126600 4766 scope.go:117] "RemoveContainer" containerID="b8190f6d1d9dc2c569155f804e6bf83370fda4966188c67cfc7705adbe223fa2" Dec 09 05:09:26 crc 
kubenswrapper[4766]: I1209 05:09:26.189643 4766 scope.go:117] "RemoveContainer" containerID="cef5bb8de4608acd6783f4e83886ca0c6b7a4a201e32898a826f5cf76916c954" Dec 09 05:09:34 crc kubenswrapper[4766]: I1209 05:09:34.839504 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:09:34 crc kubenswrapper[4766]: E1209 05:09:34.840345 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:09:48 crc kubenswrapper[4766]: I1209 05:09:48.848382 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:09:48 crc kubenswrapper[4766]: E1209 05:09:48.849608 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:10:03 crc kubenswrapper[4766]: I1209 05:10:03.840305 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:10:03 crc kubenswrapper[4766]: E1209 05:10:03.841753 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:10:15 crc kubenswrapper[4766]: I1209 05:10:15.839999 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:10:15 crc kubenswrapper[4766]: E1209 05:10:15.840982 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:10:27 crc kubenswrapper[4766]: I1209 05:10:27.839675 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:10:27 crc kubenswrapper[4766]: E1209 05:10:27.840534 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:10:40 crc kubenswrapper[4766]: I1209 05:10:40.840592 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:10:40 crc kubenswrapper[4766]: E1209 05:10:40.841608 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:10:53 crc kubenswrapper[4766]: I1209 05:10:53.840016 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:10:53 crc kubenswrapper[4766]: E1209 05:10:53.840784 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:11:06 crc kubenswrapper[4766]: I1209 05:11:06.840074 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:11:06 crc kubenswrapper[4766]: E1209 05:11:06.840898 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:11:21 crc kubenswrapper[4766]: I1209 05:11:21.839850 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:11:21 crc kubenswrapper[4766]: E1209 05:11:21.840815 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:11:34 crc kubenswrapper[4766]: I1209 05:11:34.839830 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:11:34 crc kubenswrapper[4766]: E1209 05:11:34.841140 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:11:49 crc kubenswrapper[4766]: I1209 05:11:49.839812 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:11:49 crc kubenswrapper[4766]: E1209 05:11:49.842076 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:12:04 crc kubenswrapper[4766]: I1209 05:12:04.839261 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:12:04 crc kubenswrapper[4766]: E1209 05:12:04.840052 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:12:15 crc kubenswrapper[4766]: I1209 05:12:15.839763 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:12:15 crc kubenswrapper[4766]: E1209 05:12:15.840915 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:12:26 crc kubenswrapper[4766]: I1209 05:12:26.095746 4766 generic.go:334] "Generic (PLEG): container finished" podID="b536acce-84ce-48e5-b47e-9c68a26c82a7" containerID="0444f6f4c5d254330997098faf1247a4b82994a27409efa2213b1f0c15380315" exitCode=0 Dec 09 05:12:26 crc kubenswrapper[4766]: I1209 05:12:26.095835 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" event={"ID":"b536acce-84ce-48e5-b47e-9c68a26c82a7","Type":"ContainerDied","Data":"0444f6f4c5d254330997098faf1247a4b82994a27409efa2213b1f0c15380315"} Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.604420 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.649422 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ceph\") pod \"b536acce-84ce-48e5-b47e-9c68a26c82a7\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.649776 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-tripleo-cleanup-combined-ca-bundle\") pod \"b536acce-84ce-48e5-b47e-9c68a26c82a7\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.649874 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dwrj\" (UniqueName: \"kubernetes.io/projected/b536acce-84ce-48e5-b47e-9c68a26c82a7-kube-api-access-5dwrj\") pod \"b536acce-84ce-48e5-b47e-9c68a26c82a7\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.649956 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-inventory\") pod \"b536acce-84ce-48e5-b47e-9c68a26c82a7\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.650164 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ssh-key\") pod \"b536acce-84ce-48e5-b47e-9c68a26c82a7\" (UID: \"b536acce-84ce-48e5-b47e-9c68a26c82a7\") " Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.664281 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/b536acce-84ce-48e5-b47e-9c68a26c82a7-kube-api-access-5dwrj" (OuterVolumeSpecName: "kube-api-access-5dwrj") pod "b536acce-84ce-48e5-b47e-9c68a26c82a7" (UID: "b536acce-84ce-48e5-b47e-9c68a26c82a7"). InnerVolumeSpecName "kube-api-access-5dwrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.664684 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ceph" (OuterVolumeSpecName: "ceph") pod "b536acce-84ce-48e5-b47e-9c68a26c82a7" (UID: "b536acce-84ce-48e5-b47e-9c68a26c82a7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.667367 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "b536acce-84ce-48e5-b47e-9c68a26c82a7" (UID: "b536acce-84ce-48e5-b47e-9c68a26c82a7"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.684706 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-inventory" (OuterVolumeSpecName: "inventory") pod "b536acce-84ce-48e5-b47e-9c68a26c82a7" (UID: "b536acce-84ce-48e5-b47e-9c68a26c82a7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.685760 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b536acce-84ce-48e5-b47e-9c68a26c82a7" (UID: "b536acce-84ce-48e5-b47e-9c68a26c82a7"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.753051 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.753097 4766 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.753113 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dwrj\" (UniqueName: \"kubernetes.io/projected/b536acce-84ce-48e5-b47e-9c68a26c82a7-kube-api-access-5dwrj\") on node \"crc\" DevicePath \"\"" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.753162 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.753175 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b536acce-84ce-48e5-b47e-9c68a26c82a7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:12:27 crc kubenswrapper[4766]: I1209 05:12:27.839310 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:12:27 crc kubenswrapper[4766]: E1209 05:12:27.839622 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:12:28 crc kubenswrapper[4766]: I1209 05:12:28.130063 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" event={"ID":"b536acce-84ce-48e5-b47e-9c68a26c82a7","Type":"ContainerDied","Data":"596a49c07e6ff263549535c22d9893e609e8781e188c3a9057abdf4c99bbd125"} Dec 09 05:12:28 crc kubenswrapper[4766]: I1209 05:12:28.130101 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="596a49c07e6ff263549535c22d9893e609e8781e188c3a9057abdf4c99bbd125" Dec 09 05:12:28 crc kubenswrapper[4766]: I1209 05:12:28.130159 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.133534 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-7nrh9"] Dec 09 05:12:35 crc kubenswrapper[4766]: E1209 05:12:35.134867 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerName="registry-server" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.134886 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerName="registry-server" Dec 09 05:12:35 crc kubenswrapper[4766]: E1209 05:12:35.134914 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerName="extract-utilities" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.134923 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerName="extract-utilities" Dec 09 05:12:35 crc kubenswrapper[4766]: E1209 05:12:35.134943 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b536acce-84ce-48e5-b47e-9c68a26c82a7" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.134954 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b536acce-84ce-48e5-b47e-9c68a26c82a7" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 09 05:12:35 crc kubenswrapper[4766]: E1209 05:12:35.134980 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerName="extract-content" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.134988 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerName="extract-content" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.135287 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e274cac9-c7b5-45eb-be2c-5e6aecd7b69f" containerName="registry-server" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.135311 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b536acce-84ce-48e5-b47e-9c68a26c82a7" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.136194 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.141666 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.141738 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.141748 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.142506 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.155095 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-7nrh9"] Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.305940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7wgz\" (UniqueName: \"kubernetes.io/projected/ac75c2a4-56e0-482c-94f6-919a66db4f5e-kube-api-access-d7wgz\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.306268 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.306405 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ceph\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.306585 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.306688 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-inventory\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.409109 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ceph\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.409526 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 
05:12:35.409634 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-inventory\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.409803 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7wgz\" (UniqueName: \"kubernetes.io/projected/ac75c2a4-56e0-482c-94f6-919a66db4f5e-kube-api-access-d7wgz\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.409897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.416109 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.416200 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-inventory\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc 
kubenswrapper[4766]: I1209 05:12:35.417073 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.418442 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ceph\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.434644 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7wgz\" (UniqueName: \"kubernetes.io/projected/ac75c2a4-56e0-482c-94f6-919a66db4f5e-kube-api-access-d7wgz\") pod \"bootstrap-openstack-openstack-cell1-7nrh9\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.457462 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.984314 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-7nrh9"] Dec 09 05:12:35 crc kubenswrapper[4766]: W1209 05:12:35.990719 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac75c2a4_56e0_482c_94f6_919a66db4f5e.slice/crio-fdeb53ceb1adddee96e8b85d672ce6847b06ab4d2690d2bb22b3ec95ec41d5fc WatchSource:0}: Error finding container fdeb53ceb1adddee96e8b85d672ce6847b06ab4d2690d2bb22b3ec95ec41d5fc: Status 404 returned error can't find the container with id fdeb53ceb1adddee96e8b85d672ce6847b06ab4d2690d2bb22b3ec95ec41d5fc Dec 09 05:12:35 crc kubenswrapper[4766]: I1209 05:12:35.994585 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:12:36 crc kubenswrapper[4766]: I1209 05:12:36.218858 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" event={"ID":"ac75c2a4-56e0-482c-94f6-919a66db4f5e","Type":"ContainerStarted","Data":"fdeb53ceb1adddee96e8b85d672ce6847b06ab4d2690d2bb22b3ec95ec41d5fc"} Dec 09 05:12:37 crc kubenswrapper[4766]: I1209 05:12:37.233075 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" event={"ID":"ac75c2a4-56e0-482c-94f6-919a66db4f5e","Type":"ContainerStarted","Data":"3accbdf8b14f6955fad202f4bb75957650b39694ceb9a57e2f448b12cfb18059"} Dec 09 05:12:37 crc kubenswrapper[4766]: I1209 05:12:37.260824 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" podStartSLOduration=2.07539423 podStartE2EDuration="2.260801302s" podCreationTimestamp="2025-12-09 05:12:35 +0000 UTC" firstStartedPulling="2025-12-09 05:12:35.994332137 +0000 UTC 
m=+7237.703637563" lastFinishedPulling="2025-12-09 05:12:36.179739209 +0000 UTC m=+7237.889044635" observedRunningTime="2025-12-09 05:12:37.254296816 +0000 UTC m=+7238.963602292" watchObservedRunningTime="2025-12-09 05:12:37.260801302 +0000 UTC m=+7238.970106728" Dec 09 05:12:39 crc kubenswrapper[4766]: I1209 05:12:39.840172 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:12:39 crc kubenswrapper[4766]: E1209 05:12:39.840843 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:12:52 crc kubenswrapper[4766]: I1209 05:12:52.839451 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:12:52 crc kubenswrapper[4766]: E1209 05:12:52.840424 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:13:04 crc kubenswrapper[4766]: I1209 05:13:04.839288 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:13:04 crc kubenswrapper[4766]: E1209 05:13:04.840547 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:13:19 crc kubenswrapper[4766]: I1209 05:13:19.839646 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:13:19 crc kubenswrapper[4766]: E1209 05:13:19.840460 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:13:31 crc kubenswrapper[4766]: I1209 05:13:31.839926 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:13:31 crc kubenswrapper[4766]: E1209 05:13:31.840713 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:13:46 crc kubenswrapper[4766]: I1209 05:13:46.840686 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:13:46 crc kubenswrapper[4766]: E1209 05:13:46.841690 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:14:01 crc kubenswrapper[4766]: I1209 05:14:01.845999 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:14:01 crc kubenswrapper[4766]: E1209 05:14:01.847231 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:14:13 crc kubenswrapper[4766]: I1209 05:14:13.840183 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:14:14 crc kubenswrapper[4766]: I1209 05:14:14.318893 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"ae849d50cccd9d90d9a24c69edbf389e2f39e47b7bce3e5ed5b3695b8a3514ef"} Dec 09 05:14:29 crc kubenswrapper[4766]: I1209 05:14:29.788883 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h25zc"] Dec 09 05:14:29 crc kubenswrapper[4766]: I1209 05:14:29.798629 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:29 crc kubenswrapper[4766]: I1209 05:14:29.851461 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h25zc"] Dec 09 05:14:29 crc kubenswrapper[4766]: I1209 05:14:29.939110 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z644j\" (UniqueName: \"kubernetes.io/projected/7f6143a1-ac00-481a-8c1d-fa21c11601f6-kube-api-access-z644j\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:29 crc kubenswrapper[4766]: I1209 05:14:29.939398 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-utilities\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:29 crc kubenswrapper[4766]: I1209 05:14:29.939606 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-catalog-content\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:30 crc kubenswrapper[4766]: I1209 05:14:30.042381 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z644j\" (UniqueName: \"kubernetes.io/projected/7f6143a1-ac00-481a-8c1d-fa21c11601f6-kube-api-access-z644j\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:30 crc kubenswrapper[4766]: I1209 05:14:30.042529 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-utilities\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:30 crc kubenswrapper[4766]: I1209 05:14:30.042584 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-catalog-content\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:30 crc kubenswrapper[4766]: I1209 05:14:30.043049 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-utilities\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:30 crc kubenswrapper[4766]: I1209 05:14:30.043123 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-catalog-content\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:30 crc kubenswrapper[4766]: I1209 05:14:30.063065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z644j\" (UniqueName: \"kubernetes.io/projected/7f6143a1-ac00-481a-8c1d-fa21c11601f6-kube-api-access-z644j\") pod \"redhat-marketplace-h25zc\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:30 crc kubenswrapper[4766]: I1209 05:14:30.157664 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:30 crc kubenswrapper[4766]: I1209 05:14:30.625979 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h25zc"] Dec 09 05:14:31 crc kubenswrapper[4766]: I1209 05:14:31.531502 4766 generic.go:334] "Generic (PLEG): container finished" podID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerID="2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1" exitCode=0 Dec 09 05:14:31 crc kubenswrapper[4766]: I1209 05:14:31.531928 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h25zc" event={"ID":"7f6143a1-ac00-481a-8c1d-fa21c11601f6","Type":"ContainerDied","Data":"2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1"} Dec 09 05:14:31 crc kubenswrapper[4766]: I1209 05:14:31.531980 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h25zc" event={"ID":"7f6143a1-ac00-481a-8c1d-fa21c11601f6","Type":"ContainerStarted","Data":"fc764eb42a88b31c689b5920bae3f256ba606a19b1779fe4ac227da67da5d257"} Dec 09 05:14:32 crc kubenswrapper[4766]: I1209 05:14:32.546795 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h25zc" event={"ID":"7f6143a1-ac00-481a-8c1d-fa21c11601f6","Type":"ContainerStarted","Data":"9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965"} Dec 09 05:14:33 crc kubenswrapper[4766]: I1209 05:14:33.556621 4766 generic.go:334] "Generic (PLEG): container finished" podID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerID="9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965" exitCode=0 Dec 09 05:14:33 crc kubenswrapper[4766]: I1209 05:14:33.556705 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h25zc" 
event={"ID":"7f6143a1-ac00-481a-8c1d-fa21c11601f6","Type":"ContainerDied","Data":"9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965"} Dec 09 05:14:34 crc kubenswrapper[4766]: I1209 05:14:34.568490 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h25zc" event={"ID":"7f6143a1-ac00-481a-8c1d-fa21c11601f6","Type":"ContainerStarted","Data":"388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c"} Dec 09 05:14:34 crc kubenswrapper[4766]: I1209 05:14:34.590074 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h25zc" podStartSLOduration=3.175286799 podStartE2EDuration="5.590047167s" podCreationTimestamp="2025-12-09 05:14:29 +0000 UTC" firstStartedPulling="2025-12-09 05:14:31.536343347 +0000 UTC m=+7353.245648813" lastFinishedPulling="2025-12-09 05:14:33.951103755 +0000 UTC m=+7355.660409181" observedRunningTime="2025-12-09 05:14:34.586785078 +0000 UTC m=+7356.296090514" watchObservedRunningTime="2025-12-09 05:14:34.590047167 +0000 UTC m=+7356.299352593" Dec 09 05:14:40 crc kubenswrapper[4766]: I1209 05:14:40.158055 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:40 crc kubenswrapper[4766]: I1209 05:14:40.160352 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:40 crc kubenswrapper[4766]: I1209 05:14:40.254037 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:40 crc kubenswrapper[4766]: I1209 05:14:40.703534 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:40 crc kubenswrapper[4766]: I1209 05:14:40.774628 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-h25zc"] Dec 09 05:14:42 crc kubenswrapper[4766]: I1209 05:14:42.680402 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h25zc" podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerName="registry-server" containerID="cri-o://388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c" gracePeriod=2 Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.235529 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.403410 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z644j\" (UniqueName: \"kubernetes.io/projected/7f6143a1-ac00-481a-8c1d-fa21c11601f6-kube-api-access-z644j\") pod \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.403480 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-catalog-content\") pod \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.403714 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-utilities\") pod \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\" (UID: \"7f6143a1-ac00-481a-8c1d-fa21c11601f6\") " Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.404442 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-utilities" (OuterVolumeSpecName: "utilities") pod "7f6143a1-ac00-481a-8c1d-fa21c11601f6" (UID: 
"7f6143a1-ac00-481a-8c1d-fa21c11601f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.405677 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.409328 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6143a1-ac00-481a-8c1d-fa21c11601f6-kube-api-access-z644j" (OuterVolumeSpecName: "kube-api-access-z644j") pod "7f6143a1-ac00-481a-8c1d-fa21c11601f6" (UID: "7f6143a1-ac00-481a-8c1d-fa21c11601f6"). InnerVolumeSpecName "kube-api-access-z644j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.420727 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f6143a1-ac00-481a-8c1d-fa21c11601f6" (UID: "7f6143a1-ac00-481a-8c1d-fa21c11601f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.507111 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z644j\" (UniqueName: \"kubernetes.io/projected/7f6143a1-ac00-481a-8c1d-fa21c11601f6-kube-api-access-z644j\") on node \"crc\" DevicePath \"\"" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.507155 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6143a1-ac00-481a-8c1d-fa21c11601f6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.691196 4766 generic.go:334] "Generic (PLEG): container finished" podID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerID="388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c" exitCode=0 Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.691248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h25zc" event={"ID":"7f6143a1-ac00-481a-8c1d-fa21c11601f6","Type":"ContainerDied","Data":"388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c"} Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.691266 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h25zc" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.691282 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h25zc" event={"ID":"7f6143a1-ac00-481a-8c1d-fa21c11601f6","Type":"ContainerDied","Data":"fc764eb42a88b31c689b5920bae3f256ba606a19b1779fe4ac227da67da5d257"} Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.691304 4766 scope.go:117] "RemoveContainer" containerID="388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.713133 4766 scope.go:117] "RemoveContainer" containerID="9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.730494 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h25zc"] Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.740532 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h25zc"] Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.762964 4766 scope.go:117] "RemoveContainer" containerID="2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.801855 4766 scope.go:117] "RemoveContainer" containerID="388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c" Dec 09 05:14:43 crc kubenswrapper[4766]: E1209 05:14:43.802441 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c\": container with ID starting with 388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c not found: ID does not exist" containerID="388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.802470 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c"} err="failed to get container status \"388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c\": rpc error: code = NotFound desc = could not find container \"388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c\": container with ID starting with 388d853ec64d9119e031f7e704389217222c108161efc3003dc672b5e2a4930c not found: ID does not exist" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.802492 4766 scope.go:117] "RemoveContainer" containerID="9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965" Dec 09 05:14:43 crc kubenswrapper[4766]: E1209 05:14:43.803208 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965\": container with ID starting with 9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965 not found: ID does not exist" containerID="9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.803288 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965"} err="failed to get container status \"9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965\": rpc error: code = NotFound desc = could not find container \"9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965\": container with ID starting with 9ee7ef961dc7443fb3e570444aa9c4912998fb22e0067045dce67e4e3d4ae965 not found: ID does not exist" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.803306 4766 scope.go:117] "RemoveContainer" containerID="2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1" Dec 09 05:14:43 crc kubenswrapper[4766]: E1209 
05:14:43.803678 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1\": container with ID starting with 2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1 not found: ID does not exist" containerID="2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1" Dec 09 05:14:43 crc kubenswrapper[4766]: I1209 05:14:43.803706 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1"} err="failed to get container status \"2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1\": rpc error: code = NotFound desc = could not find container \"2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1\": container with ID starting with 2f78baf59b143a431cfe5c7673075ea14199243798bc3b150eb03f74558050e1 not found: ID does not exist" Dec 09 05:14:44 crc kubenswrapper[4766]: I1209 05:14:44.853516 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" path="/var/lib/kubelet/pods/7f6143a1-ac00-481a-8c1d-fa21c11601f6/volumes" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.179952 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj"] Dec 09 05:15:00 crc kubenswrapper[4766]: E1209 05:15:00.180975 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerName="extract-utilities" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.180998 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerName="extract-utilities" Dec 09 05:15:00 crc kubenswrapper[4766]: E1209 05:15:00.181025 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerName="registry-server" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.181036 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerName="registry-server" Dec 09 05:15:00 crc kubenswrapper[4766]: E1209 05:15:00.181103 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerName="extract-content" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.181116 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerName="extract-content" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.181460 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6143a1-ac00-481a-8c1d-fa21c11601f6" containerName="registry-server" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.182342 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.184643 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.184665 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.190614 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj"] Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.299339 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66qbz\" (UniqueName: \"kubernetes.io/projected/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-kube-api-access-66qbz\") pod 
\"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.299475 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-config-volume\") pod \"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.299551 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-secret-volume\") pod \"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.401951 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66qbz\" (UniqueName: \"kubernetes.io/projected/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-kube-api-access-66qbz\") pod \"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.402022 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-config-volume\") pod \"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.402093 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-secret-volume\") pod \"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.403192 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-config-volume\") pod \"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.411976 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-secret-volume\") pod \"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.424978 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66qbz\" (UniqueName: \"kubernetes.io/projected/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-kube-api-access-66qbz\") pod \"collect-profiles-29420955-7cghj\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:00 crc kubenswrapper[4766]: I1209 05:15:00.521174 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:01 crc kubenswrapper[4766]: I1209 05:15:01.026065 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj"] Dec 09 05:15:01 crc kubenswrapper[4766]: I1209 05:15:01.886286 4766 generic.go:334] "Generic (PLEG): container finished" podID="011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2" containerID="0cf9e10bf47c7edcf552b6625f28e358f12bc7ca0859a85457e77de8d63b146a" exitCode=0 Dec 09 05:15:01 crc kubenswrapper[4766]: I1209 05:15:01.886519 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" event={"ID":"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2","Type":"ContainerDied","Data":"0cf9e10bf47c7edcf552b6625f28e358f12bc7ca0859a85457e77de8d63b146a"} Dec 09 05:15:01 crc kubenswrapper[4766]: I1209 05:15:01.886627 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" event={"ID":"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2","Type":"ContainerStarted","Data":"682f066d20d680c70d000b1c16cb73b10da30af1a6fe2268277f47342bd92be3"} Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.292586 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.360799 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-secret-volume\") pod \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.360981 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66qbz\" (UniqueName: \"kubernetes.io/projected/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-kube-api-access-66qbz\") pod \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.361027 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-config-volume\") pod \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\" (UID: \"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2\") " Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.362881 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2" (UID: "011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.369302 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2" (UID: "011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.371194 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-kube-api-access-66qbz" (OuterVolumeSpecName: "kube-api-access-66qbz") pod "011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2" (UID: "011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2"). InnerVolumeSpecName "kube-api-access-66qbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.470836 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.470872 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66qbz\" (UniqueName: \"kubernetes.io/projected/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-kube-api-access-66qbz\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.470883 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.912611 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" event={"ID":"011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2","Type":"ContainerDied","Data":"682f066d20d680c70d000b1c16cb73b10da30af1a6fe2268277f47342bd92be3"} Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.913050 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="682f066d20d680c70d000b1c16cb73b10da30af1a6fe2268277f47342bd92be3" Dec 09 05:15:03 crc kubenswrapper[4766]: I1209 05:15:03.912677 4766 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj" Dec 09 05:15:04 crc kubenswrapper[4766]: I1209 05:15:04.370443 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp"] Dec 09 05:15:04 crc kubenswrapper[4766]: I1209 05:15:04.381596 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420910-qn6pp"] Dec 09 05:15:04 crc kubenswrapper[4766]: I1209 05:15:04.875406 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d6bf2ea-ab41-4d7c-915c-173cccd5043a" path="/var/lib/kubelet/pods/7d6bf2ea-ab41-4d7c-915c-173cccd5043a/volumes" Dec 09 05:15:26 crc kubenswrapper[4766]: I1209 05:15:26.440615 4766 scope.go:117] "RemoveContainer" containerID="2081b0b7b3b77dfd02bcb3b144ddc1644d494418c760cd67ab818e150e85083b" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.767989 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hlq46"] Dec 09 05:15:45 crc kubenswrapper[4766]: E1209 05:15:45.771930 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2" containerName="collect-profiles" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.771956 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2" containerName="collect-profiles" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.772206 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2" containerName="collect-profiles" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.774231 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.784449 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hlq46"] Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.830617 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-utilities\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.830696 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-catalog-content\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.830821 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxllc\" (UniqueName: \"kubernetes.io/projected/b79a3ec9-067f-445b-ae1c-6f0a465937c6-kube-api-access-lxllc\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.933018 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-utilities\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.933380 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-catalog-content\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.933826 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-utilities\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.933843 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-catalog-content\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.935285 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxllc\" (UniqueName: \"kubernetes.io/projected/b79a3ec9-067f-445b-ae1c-6f0a465937c6-kube-api-access-lxllc\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:45 crc kubenswrapper[4766]: I1209 05:15:45.957441 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxllc\" (UniqueName: \"kubernetes.io/projected/b79a3ec9-067f-445b-ae1c-6f0a465937c6-kube-api-access-lxllc\") pod \"certified-operators-hlq46\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:46 crc kubenswrapper[4766]: I1209 05:15:46.144781 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:46 crc kubenswrapper[4766]: I1209 05:15:46.700548 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hlq46"] Dec 09 05:15:47 crc kubenswrapper[4766]: I1209 05:15:47.394054 4766 generic.go:334] "Generic (PLEG): container finished" podID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerID="bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e" exitCode=0 Dec 09 05:15:47 crc kubenswrapper[4766]: I1209 05:15:47.394093 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlq46" event={"ID":"b79a3ec9-067f-445b-ae1c-6f0a465937c6","Type":"ContainerDied","Data":"bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e"} Dec 09 05:15:47 crc kubenswrapper[4766]: I1209 05:15:47.394503 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlq46" event={"ID":"b79a3ec9-067f-445b-ae1c-6f0a465937c6","Type":"ContainerStarted","Data":"073ef790baf41b3a2546a86c2d05b00b644debb4f015eb83bd8fd7ad60d024e1"} Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.406618 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlq46" event={"ID":"b79a3ec9-067f-445b-ae1c-6f0a465937c6","Type":"ContainerStarted","Data":"13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561"} Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.745659 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qrcrj"] Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.749923 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.763590 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrcrj"] Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.855921 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-utilities\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.856273 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqskh\" (UniqueName: \"kubernetes.io/projected/d888f325-3ddd-4cb5-850b-e7ac8fee0489-kube-api-access-sqskh\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.856419 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-catalog-content\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.958426 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqskh\" (UniqueName: \"kubernetes.io/projected/d888f325-3ddd-4cb5-850b-e7ac8fee0489-kube-api-access-sqskh\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.958582 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-catalog-content\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.958833 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-utilities\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.959011 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-catalog-content\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.959263 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-utilities\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:48 crc kubenswrapper[4766]: I1209 05:15:48.988194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqskh\" (UniqueName: \"kubernetes.io/projected/d888f325-3ddd-4cb5-850b-e7ac8fee0489-kube-api-access-sqskh\") pod \"redhat-operators-qrcrj\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:49 crc kubenswrapper[4766]: I1209 05:15:49.080681 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:49 crc kubenswrapper[4766]: I1209 05:15:49.423639 4766 generic.go:334] "Generic (PLEG): container finished" podID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerID="13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561" exitCode=0 Dec 09 05:15:49 crc kubenswrapper[4766]: I1209 05:15:49.423827 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlq46" event={"ID":"b79a3ec9-067f-445b-ae1c-6f0a465937c6","Type":"ContainerDied","Data":"13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561"} Dec 09 05:15:49 crc kubenswrapper[4766]: I1209 05:15:49.647448 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrcrj"] Dec 09 05:15:49 crc kubenswrapper[4766]: W1209 05:15:49.651331 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd888f325_3ddd_4cb5_850b_e7ac8fee0489.slice/crio-a396edd2b96d29d53bc064aef675d1740b690f593e2cb64f184ffde55a38c9a5 WatchSource:0}: Error finding container a396edd2b96d29d53bc064aef675d1740b690f593e2cb64f184ffde55a38c9a5: Status 404 returned error can't find the container with id a396edd2b96d29d53bc064aef675d1740b690f593e2cb64f184ffde55a38c9a5 Dec 09 05:15:50 crc kubenswrapper[4766]: I1209 05:15:50.433561 4766 generic.go:334] "Generic (PLEG): container finished" podID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerID="f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193" exitCode=0 Dec 09 05:15:50 crc kubenswrapper[4766]: I1209 05:15:50.433905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrcrj" event={"ID":"d888f325-3ddd-4cb5-850b-e7ac8fee0489","Type":"ContainerDied","Data":"f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193"} Dec 09 05:15:50 crc kubenswrapper[4766]: I1209 05:15:50.433934 
4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrcrj" event={"ID":"d888f325-3ddd-4cb5-850b-e7ac8fee0489","Type":"ContainerStarted","Data":"a396edd2b96d29d53bc064aef675d1740b690f593e2cb64f184ffde55a38c9a5"} Dec 09 05:15:51 crc kubenswrapper[4766]: I1209 05:15:51.446534 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrcrj" event={"ID":"d888f325-3ddd-4cb5-850b-e7ac8fee0489","Type":"ContainerStarted","Data":"ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d"} Dec 09 05:15:51 crc kubenswrapper[4766]: I1209 05:15:51.449943 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlq46" event={"ID":"b79a3ec9-067f-445b-ae1c-6f0a465937c6","Type":"ContainerStarted","Data":"c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479"} Dec 09 05:15:51 crc kubenswrapper[4766]: I1209 05:15:51.488500 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hlq46" podStartSLOduration=4.05092291 podStartE2EDuration="6.488477246s" podCreationTimestamp="2025-12-09 05:15:45 +0000 UTC" firstStartedPulling="2025-12-09 05:15:47.39608123 +0000 UTC m=+7429.105386676" lastFinishedPulling="2025-12-09 05:15:49.833635586 +0000 UTC m=+7431.542941012" observedRunningTime="2025-12-09 05:15:51.479355789 +0000 UTC m=+7433.188661225" watchObservedRunningTime="2025-12-09 05:15:51.488477246 +0000 UTC m=+7433.197782692" Dec 09 05:15:52 crc kubenswrapper[4766]: I1209 05:15:52.460425 4766 generic.go:334] "Generic (PLEG): container finished" podID="ac75c2a4-56e0-482c-94f6-919a66db4f5e" containerID="3accbdf8b14f6955fad202f4bb75957650b39694ceb9a57e2f448b12cfb18059" exitCode=0 Dec 09 05:15:52 crc kubenswrapper[4766]: I1209 05:15:52.460478 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" 
event={"ID":"ac75c2a4-56e0-482c-94f6-919a66db4f5e","Type":"ContainerDied","Data":"3accbdf8b14f6955fad202f4bb75957650b39694ceb9a57e2f448b12cfb18059"} Dec 09 05:15:53 crc kubenswrapper[4766]: I1209 05:15:53.971626 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.113279 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7wgz\" (UniqueName: \"kubernetes.io/projected/ac75c2a4-56e0-482c-94f6-919a66db4f5e-kube-api-access-d7wgz\") pod \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.113313 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ceph\") pod \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.113381 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ssh-key\") pod \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.113397 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-bootstrap-combined-ca-bundle\") pod \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.113428 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-inventory\") pod \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\" (UID: \"ac75c2a4-56e0-482c-94f6-919a66db4f5e\") " Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.119580 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ac75c2a4-56e0-482c-94f6-919a66db4f5e" (UID: "ac75c2a4-56e0-482c-94f6-919a66db4f5e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.119611 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ceph" (OuterVolumeSpecName: "ceph") pod "ac75c2a4-56e0-482c-94f6-919a66db4f5e" (UID: "ac75c2a4-56e0-482c-94f6-919a66db4f5e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.126602 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac75c2a4-56e0-482c-94f6-919a66db4f5e-kube-api-access-d7wgz" (OuterVolumeSpecName: "kube-api-access-d7wgz") pod "ac75c2a4-56e0-482c-94f6-919a66db4f5e" (UID: "ac75c2a4-56e0-482c-94f6-919a66db4f5e"). InnerVolumeSpecName "kube-api-access-d7wgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.146147 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-inventory" (OuterVolumeSpecName: "inventory") pod "ac75c2a4-56e0-482c-94f6-919a66db4f5e" (UID: "ac75c2a4-56e0-482c-94f6-919a66db4f5e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.148482 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ac75c2a4-56e0-482c-94f6-919a66db4f5e" (UID: "ac75c2a4-56e0-482c-94f6-919a66db4f5e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.216839 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7wgz\" (UniqueName: \"kubernetes.io/projected/ac75c2a4-56e0-482c-94f6-919a66db4f5e-kube-api-access-d7wgz\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.217035 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.217093 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.217147 4766 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.217208 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac75c2a4-56e0-482c-94f6-919a66db4f5e-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.491175 4766 generic.go:334] "Generic (PLEG): container finished" podID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" 
containerID="ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d" exitCode=0 Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.491320 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrcrj" event={"ID":"d888f325-3ddd-4cb5-850b-e7ac8fee0489","Type":"ContainerDied","Data":"ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d"} Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.503206 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" event={"ID":"ac75c2a4-56e0-482c-94f6-919a66db4f5e","Type":"ContainerDied","Data":"fdeb53ceb1adddee96e8b85d672ce6847b06ab4d2690d2bb22b3ec95ec41d5fc"} Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.503272 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdeb53ceb1adddee96e8b85d672ce6847b06ab4d2690d2bb22b3ec95ec41d5fc" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.503325 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-7nrh9" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.569028 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-nmbq8"] Dec 09 05:15:54 crc kubenswrapper[4766]: E1209 05:15:54.570131 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac75c2a4-56e0-482c-94f6-919a66db4f5e" containerName="bootstrap-openstack-openstack-cell1" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.571855 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac75c2a4-56e0-482c-94f6-919a66db4f5e" containerName="bootstrap-openstack-openstack-cell1" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.572317 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac75c2a4-56e0-482c-94f6-919a66db4f5e" containerName="bootstrap-openstack-openstack-cell1" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.573621 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.576251 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.576362 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.578848 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.590590 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-nmbq8"] Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.595995 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.726268 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp82w\" (UniqueName: \"kubernetes.io/projected/03e6ab22-c008-415b-8d9d-08ca8d4b3379-kube-api-access-mp82w\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.726328 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ssh-key\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.726585 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ceph\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.726673 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-inventory\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.829395 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp82w\" (UniqueName: \"kubernetes.io/projected/03e6ab22-c008-415b-8d9d-08ca8d4b3379-kube-api-access-mp82w\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.829486 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ssh-key\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.829532 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ceph\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: 
I1209 05:15:54.829560 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-inventory\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.835432 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-inventory\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.835525 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ceph\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.839717 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ssh-key\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.849531 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp82w\" (UniqueName: \"kubernetes.io/projected/03e6ab22-c008-415b-8d9d-08ca8d4b3379-kube-api-access-mp82w\") pod \"download-cache-openstack-openstack-cell1-nmbq8\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" 
Dec 09 05:15:54 crc kubenswrapper[4766]: I1209 05:15:54.932242 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:15:55 crc kubenswrapper[4766]: I1209 05:15:55.512940 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-nmbq8"] Dec 09 05:15:55 crc kubenswrapper[4766]: I1209 05:15:55.531139 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrcrj" event={"ID":"d888f325-3ddd-4cb5-850b-e7ac8fee0489","Type":"ContainerStarted","Data":"0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a"} Dec 09 05:15:55 crc kubenswrapper[4766]: I1209 05:15:55.558526 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qrcrj" podStartSLOduration=3.066764338 podStartE2EDuration="7.558503196s" podCreationTimestamp="2025-12-09 05:15:48 +0000 UTC" firstStartedPulling="2025-12-09 05:15:50.436321776 +0000 UTC m=+7432.145627202" lastFinishedPulling="2025-12-09 05:15:54.928060634 +0000 UTC m=+7436.637366060" observedRunningTime="2025-12-09 05:15:55.55496412 +0000 UTC m=+7437.264269546" watchObservedRunningTime="2025-12-09 05:15:55.558503196 +0000 UTC m=+7437.267808622" Dec 09 05:15:56 crc kubenswrapper[4766]: I1209 05:15:56.145141 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:56 crc kubenswrapper[4766]: I1209 05:15:56.145588 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:56 crc kubenswrapper[4766]: I1209 05:15:56.218134 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:56 crc kubenswrapper[4766]: I1209 05:15:56.547117 4766 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" event={"ID":"03e6ab22-c008-415b-8d9d-08ca8d4b3379","Type":"ContainerStarted","Data":"8d2cb9acb84dc233efb22101d6542031ad13a736c100fbaeae0f5edd25c502ce"} Dec 09 05:15:56 crc kubenswrapper[4766]: I1209 05:15:56.547545 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" event={"ID":"03e6ab22-c008-415b-8d9d-08ca8d4b3379","Type":"ContainerStarted","Data":"6bcef1ec111705b8edb09db3da2abe076ec028328979e2ad4f4dce875ee7ef39"} Dec 09 05:15:56 crc kubenswrapper[4766]: I1209 05:15:56.574476 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" podStartSLOduration=2.25884422 podStartE2EDuration="2.574455516s" podCreationTimestamp="2025-12-09 05:15:54 +0000 UTC" firstStartedPulling="2025-12-09 05:15:55.529271824 +0000 UTC m=+7437.238577250" lastFinishedPulling="2025-12-09 05:15:55.84488311 +0000 UTC m=+7437.554188546" observedRunningTime="2025-12-09 05:15:56.568086983 +0000 UTC m=+7438.277392429" watchObservedRunningTime="2025-12-09 05:15:56.574455516 +0000 UTC m=+7438.283760952" Dec 09 05:15:56 crc kubenswrapper[4766]: I1209 05:15:56.612243 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:58 crc kubenswrapper[4766]: I1209 05:15:58.353919 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hlq46"] Dec 09 05:15:58 crc kubenswrapper[4766]: I1209 05:15:58.563032 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hlq46" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerName="registry-server" containerID="cri-o://c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479" gracePeriod=2 Dec 09 05:15:59 crc kubenswrapper[4766]: 
I1209 05:15:59.079561 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.080792 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.080912 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.243009 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-utilities\") pod \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.243285 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxllc\" (UniqueName: \"kubernetes.io/projected/b79a3ec9-067f-445b-ae1c-6f0a465937c6-kube-api-access-lxllc\") pod \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.243359 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-catalog-content\") pod \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\" (UID: \"b79a3ec9-067f-445b-ae1c-6f0a465937c6\") " Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.243717 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-utilities" (OuterVolumeSpecName: "utilities") pod "b79a3ec9-067f-445b-ae1c-6f0a465937c6" (UID: "b79a3ec9-067f-445b-ae1c-6f0a465937c6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.244025 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.249130 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79a3ec9-067f-445b-ae1c-6f0a465937c6-kube-api-access-lxllc" (OuterVolumeSpecName: "kube-api-access-lxllc") pod "b79a3ec9-067f-445b-ae1c-6f0a465937c6" (UID: "b79a3ec9-067f-445b-ae1c-6f0a465937c6"). InnerVolumeSpecName "kube-api-access-lxllc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.288456 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b79a3ec9-067f-445b-ae1c-6f0a465937c6" (UID: "b79a3ec9-067f-445b-ae1c-6f0a465937c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.346115 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxllc\" (UniqueName: \"kubernetes.io/projected/b79a3ec9-067f-445b-ae1c-6f0a465937c6-kube-api-access-lxllc\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.346170 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b79a3ec9-067f-445b-ae1c-6f0a465937c6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.574003 4766 generic.go:334] "Generic (PLEG): container finished" podID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerID="c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479" exitCode=0 Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.574108 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlq46" event={"ID":"b79a3ec9-067f-445b-ae1c-6f0a465937c6","Type":"ContainerDied","Data":"c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479"} Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.574168 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlq46" event={"ID":"b79a3ec9-067f-445b-ae1c-6f0a465937c6","Type":"ContainerDied","Data":"073ef790baf41b3a2546a86c2d05b00b644debb4f015eb83bd8fd7ad60d024e1"} Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.574198 4766 scope.go:117] "RemoveContainer" containerID="c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.575172 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hlq46" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.596610 4766 scope.go:117] "RemoveContainer" containerID="13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.613409 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hlq46"] Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.622798 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hlq46"] Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.632092 4766 scope.go:117] "RemoveContainer" containerID="bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.672954 4766 scope.go:117] "RemoveContainer" containerID="c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479" Dec 09 05:15:59 crc kubenswrapper[4766]: E1209 05:15:59.673555 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479\": container with ID starting with c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479 not found: ID does not exist" containerID="c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.673588 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479"} err="failed to get container status \"c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479\": rpc error: code = NotFound desc = could not find container \"c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479\": container with ID starting with c4b0208ddf8ed101e0d753df2a689d5067ed02e08c84c259ae3592442fa10479 not 
found: ID does not exist" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.673623 4766 scope.go:117] "RemoveContainer" containerID="13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561" Dec 09 05:15:59 crc kubenswrapper[4766]: E1209 05:15:59.673886 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561\": container with ID starting with 13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561 not found: ID does not exist" containerID="13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.673936 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561"} err="failed to get container status \"13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561\": rpc error: code = NotFound desc = could not find container \"13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561\": container with ID starting with 13df930f57db432b96906c39f5e88e5b157dc16ec00f1a560eb52a8cbbaba561 not found: ID does not exist" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.673982 4766 scope.go:117] "RemoveContainer" containerID="bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e" Dec 09 05:15:59 crc kubenswrapper[4766]: E1209 05:15:59.674329 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e\": container with ID starting with bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e not found: ID does not exist" containerID="bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e" Dec 09 05:15:59 crc kubenswrapper[4766]: I1209 05:15:59.674408 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e"} err="failed to get container status \"bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e\": rpc error: code = NotFound desc = could not find container \"bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e\": container with ID starting with bf20a35b19c974c73e33db396b3805c9404a70cd6b3e766a8eb56854a6a9568e not found: ID does not exist" Dec 09 05:16:00 crc kubenswrapper[4766]: I1209 05:16:00.150551 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qrcrj" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerName="registry-server" probeResult="failure" output=< Dec 09 05:16:00 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:16:00 crc kubenswrapper[4766]: > Dec 09 05:16:00 crc kubenswrapper[4766]: I1209 05:16:00.852403 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" path="/var/lib/kubelet/pods/b79a3ec9-067f-445b-ae1c-6f0a465937c6/volumes" Dec 09 05:16:09 crc kubenswrapper[4766]: I1209 05:16:09.127772 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:16:09 crc kubenswrapper[4766]: I1209 05:16:09.230282 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:16:09 crc kubenswrapper[4766]: I1209 05:16:09.362292 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrcrj"] Dec 09 05:16:10 crc kubenswrapper[4766]: I1209 05:16:10.705855 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qrcrj" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" 
containerName="registry-server" containerID="cri-o://0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a" gracePeriod=2 Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.202797 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.343360 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-catalog-content\") pod \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.344433 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqskh\" (UniqueName: \"kubernetes.io/projected/d888f325-3ddd-4cb5-850b-e7ac8fee0489-kube-api-access-sqskh\") pod \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.344459 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-utilities\") pod \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\" (UID: \"d888f325-3ddd-4cb5-850b-e7ac8fee0489\") " Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.345385 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-utilities" (OuterVolumeSpecName: "utilities") pod "d888f325-3ddd-4cb5-850b-e7ac8fee0489" (UID: "d888f325-3ddd-4cb5-850b-e7ac8fee0489"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.345476 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.353517 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d888f325-3ddd-4cb5-850b-e7ac8fee0489-kube-api-access-sqskh" (OuterVolumeSpecName: "kube-api-access-sqskh") pod "d888f325-3ddd-4cb5-850b-e7ac8fee0489" (UID: "d888f325-3ddd-4cb5-850b-e7ac8fee0489"). InnerVolumeSpecName "kube-api-access-sqskh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.449318 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqskh\" (UniqueName: \"kubernetes.io/projected/d888f325-3ddd-4cb5-850b-e7ac8fee0489-kube-api-access-sqskh\") on node \"crc\" DevicePath \"\"" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.466495 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d888f325-3ddd-4cb5-850b-e7ac8fee0489" (UID: "d888f325-3ddd-4cb5-850b-e7ac8fee0489"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.551540 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d888f325-3ddd-4cb5-850b-e7ac8fee0489-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.726117 4766 generic.go:334] "Generic (PLEG): container finished" podID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerID="0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a" exitCode=0 Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.726244 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrcrj" event={"ID":"d888f325-3ddd-4cb5-850b-e7ac8fee0489","Type":"ContainerDied","Data":"0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a"} Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.726468 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrcrj" event={"ID":"d888f325-3ddd-4cb5-850b-e7ac8fee0489","Type":"ContainerDied","Data":"a396edd2b96d29d53bc064aef675d1740b690f593e2cb64f184ffde55a38c9a5"} Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.726501 4766 scope.go:117] "RemoveContainer" containerID="0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.726286 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrcrj" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.764500 4766 scope.go:117] "RemoveContainer" containerID="ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.797808 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrcrj"] Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.808903 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qrcrj"] Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.820332 4766 scope.go:117] "RemoveContainer" containerID="f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.854823 4766 scope.go:117] "RemoveContainer" containerID="0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a" Dec 09 05:16:11 crc kubenswrapper[4766]: E1209 05:16:11.855546 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a\": container with ID starting with 0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a not found: ID does not exist" containerID="0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.855608 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a"} err="failed to get container status \"0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a\": rpc error: code = NotFound desc = could not find container \"0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a\": container with ID starting with 0f61d3429b77418a005cda6f232a32f6b991f8f5b8529bcf2d81590a21a3811a not found: ID does 
not exist" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.855634 4766 scope.go:117] "RemoveContainer" containerID="ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d" Dec 09 05:16:11 crc kubenswrapper[4766]: E1209 05:16:11.855976 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d\": container with ID starting with ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d not found: ID does not exist" containerID="ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.856015 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d"} err="failed to get container status \"ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d\": rpc error: code = NotFound desc = could not find container \"ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d\": container with ID starting with ef23db9ca86278a5f28209bcb023daf9a8f26076f01c93c6a8763e1a6cc5b91d not found: ID does not exist" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.856046 4766 scope.go:117] "RemoveContainer" containerID="f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193" Dec 09 05:16:11 crc kubenswrapper[4766]: E1209 05:16:11.856372 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193\": container with ID starting with f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193 not found: ID does not exist" containerID="f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193" Dec 09 05:16:11 crc kubenswrapper[4766]: I1209 05:16:11.856420 4766 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193"} err="failed to get container status \"f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193\": rpc error: code = NotFound desc = could not find container \"f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193\": container with ID starting with f7688c664433250bb35142686cb42c9663dc79c795820a0263f73a188d3fc193 not found: ID does not exist" Dec 09 05:16:12 crc kubenswrapper[4766]: I1209 05:16:12.863973 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" path="/var/lib/kubelet/pods/d888f325-3ddd-4cb5-850b-e7ac8fee0489/volumes" Dec 09 05:16:37 crc kubenswrapper[4766]: I1209 05:16:37.317061 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:16:37 crc kubenswrapper[4766]: I1209 05:16:37.317654 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:17:07 crc kubenswrapper[4766]: I1209 05:17:07.316997 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:17:07 crc kubenswrapper[4766]: I1209 05:17:07.317635 4766 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:17:34 crc kubenswrapper[4766]: I1209 05:17:34.666761 4766 generic.go:334] "Generic (PLEG): container finished" podID="03e6ab22-c008-415b-8d9d-08ca8d4b3379" containerID="8d2cb9acb84dc233efb22101d6542031ad13a736c100fbaeae0f5edd25c502ce" exitCode=0 Dec 09 05:17:34 crc kubenswrapper[4766]: I1209 05:17:34.666860 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" event={"ID":"03e6ab22-c008-415b-8d9d-08ca8d4b3379","Type":"ContainerDied","Data":"8d2cb9acb84dc233efb22101d6542031ad13a736c100fbaeae0f5edd25c502ce"} Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.183318 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.262662 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ssh-key\") pod \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.262814 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp82w\" (UniqueName: \"kubernetes.io/projected/03e6ab22-c008-415b-8d9d-08ca8d4b3379-kube-api-access-mp82w\") pod \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.262930 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ceph\") pod \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.263013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-inventory\") pod \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\" (UID: \"03e6ab22-c008-415b-8d9d-08ca8d4b3379\") " Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.268666 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03e6ab22-c008-415b-8d9d-08ca8d4b3379-kube-api-access-mp82w" (OuterVolumeSpecName: "kube-api-access-mp82w") pod "03e6ab22-c008-415b-8d9d-08ca8d4b3379" (UID: "03e6ab22-c008-415b-8d9d-08ca8d4b3379"). InnerVolumeSpecName "kube-api-access-mp82w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.274178 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ceph" (OuterVolumeSpecName: "ceph") pod "03e6ab22-c008-415b-8d9d-08ca8d4b3379" (UID: "03e6ab22-c008-415b-8d9d-08ca8d4b3379"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.298920 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "03e6ab22-c008-415b-8d9d-08ca8d4b3379" (UID: "03e6ab22-c008-415b-8d9d-08ca8d4b3379"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.299653 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-inventory" (OuterVolumeSpecName: "inventory") pod "03e6ab22-c008-415b-8d9d-08ca8d4b3379" (UID: "03e6ab22-c008-415b-8d9d-08ca8d4b3379"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.366097 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.366168 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.366184 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp82w\" (UniqueName: \"kubernetes.io/projected/03e6ab22-c008-415b-8d9d-08ca8d4b3379-kube-api-access-mp82w\") on node \"crc\" DevicePath \"\"" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.366198 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/03e6ab22-c008-415b-8d9d-08ca8d4b3379-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.690060 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" event={"ID":"03e6ab22-c008-415b-8d9d-08ca8d4b3379","Type":"ContainerDied","Data":"6bcef1ec111705b8edb09db3da2abe076ec028328979e2ad4f4dce875ee7ef39"} Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.690109 4766 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="6bcef1ec111705b8edb09db3da2abe076ec028328979e2ad4f4dce875ee7ef39" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.690142 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-nmbq8" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.793823 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-f2kbb"] Dec 09 05:17:36 crc kubenswrapper[4766]: E1209 05:17:36.794502 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e6ab22-c008-415b-8d9d-08ca8d4b3379" containerName="download-cache-openstack-openstack-cell1" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794528 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e6ab22-c008-415b-8d9d-08ca8d4b3379" containerName="download-cache-openstack-openstack-cell1" Dec 09 05:17:36 crc kubenswrapper[4766]: E1209 05:17:36.794547 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerName="registry-server" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794558 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerName="registry-server" Dec 09 05:17:36 crc kubenswrapper[4766]: E1209 05:17:36.794587 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerName="extract-content" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794596 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerName="extract-content" Dec 09 05:17:36 crc kubenswrapper[4766]: E1209 05:17:36.794616 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerName="extract-content" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794623 4766 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerName="extract-content" Dec 09 05:17:36 crc kubenswrapper[4766]: E1209 05:17:36.794642 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerName="registry-server" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794650 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerName="registry-server" Dec 09 05:17:36 crc kubenswrapper[4766]: E1209 05:17:36.794673 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerName="extract-utilities" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794683 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerName="extract-utilities" Dec 09 05:17:36 crc kubenswrapper[4766]: E1209 05:17:36.794715 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerName="extract-utilities" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794723 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerName="extract-utilities" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794965 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79a3ec9-067f-445b-ae1c-6f0a465937c6" containerName="registry-server" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.794999 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d888f325-3ddd-4cb5-850b-e7ac8fee0489" containerName="registry-server" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.795029 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="03e6ab22-c008-415b-8d9d-08ca8d4b3379" containerName="download-cache-openstack-openstack-cell1" Dec 09 05:17:36 crc kubenswrapper[4766]: 
I1209 05:17:36.795983 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.799247 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.799326 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.799520 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.800245 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.807938 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-f2kbb"] Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.875337 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-inventory\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.875576 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ssh-key\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.875627 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ceph\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.875768 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slhsk\" (UniqueName: \"kubernetes.io/projected/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-kube-api-access-slhsk\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.977945 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ssh-key\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.978003 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ceph\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.978081 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slhsk\" (UniqueName: \"kubernetes.io/projected/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-kube-api-access-slhsk\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: 
\"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.978182 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-inventory\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.982865 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ceph\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.983143 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ssh-key\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.983231 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-inventory\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:36 crc kubenswrapper[4766]: I1209 05:17:36.997624 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slhsk\" (UniqueName: 
\"kubernetes.io/projected/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-kube-api-access-slhsk\") pod \"configure-network-openstack-openstack-cell1-f2kbb\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.115396 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.316771 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.317142 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.317239 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.318544 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae849d50cccd9d90d9a24c69edbf389e2f39e47b7bce3e5ed5b3695b8a3514ef"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.318602 4766 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://ae849d50cccd9d90d9a24c69edbf389e2f39e47b7bce3e5ed5b3695b8a3514ef" gracePeriod=600 Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.698565 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.703267 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="ae849d50cccd9d90d9a24c69edbf389e2f39e47b7bce3e5ed5b3695b8a3514ef" exitCode=0 Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.703321 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"ae849d50cccd9d90d9a24c69edbf389e2f39e47b7bce3e5ed5b3695b8a3514ef"} Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.705598 4766 scope.go:117] "RemoveContainer" containerID="afd247abfdfaa4e2e44ca18e0f5c8b9568a976a01725f04c324687de6c066d04" Dec 09 05:17:37 crc kubenswrapper[4766]: I1209 05:17:37.712749 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-f2kbb"] Dec 09 05:17:38 crc kubenswrapper[4766]: I1209 05:17:38.716240 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" event={"ID":"ca0814c4-dca6-4dbd-841b-79d94d5dabd5","Type":"ContainerStarted","Data":"d9c1d5d20ede1119ef67255a846f4ad05dcd87862514aeddfb839b8fcc225e5e"} Dec 09 05:17:38 crc kubenswrapper[4766]: I1209 05:17:38.717041 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" 
event={"ID":"ca0814c4-dca6-4dbd-841b-79d94d5dabd5","Type":"ContainerStarted","Data":"0edb703c9600a66ad6dbbd7231dc30e995c9b89092794abe3964ba7f4d633b31"} Dec 09 05:17:38 crc kubenswrapper[4766]: I1209 05:17:38.721265 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562"} Dec 09 05:17:38 crc kubenswrapper[4766]: I1209 05:17:38.753886 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" podStartSLOduration=2.572885908 podStartE2EDuration="2.753856718s" podCreationTimestamp="2025-12-09 05:17:36 +0000 UTC" firstStartedPulling="2025-12-09 05:17:37.698383798 +0000 UTC m=+7539.407689224" lastFinishedPulling="2025-12-09 05:17:37.879354608 +0000 UTC m=+7539.588660034" observedRunningTime="2025-12-09 05:17:38.742318395 +0000 UTC m=+7540.451623831" watchObservedRunningTime="2025-12-09 05:17:38.753856718 +0000 UTC m=+7540.463162224" Dec 09 05:18:58 crc kubenswrapper[4766]: I1209 05:18:58.669953 4766 generic.go:334] "Generic (PLEG): container finished" podID="ca0814c4-dca6-4dbd-841b-79d94d5dabd5" containerID="d9c1d5d20ede1119ef67255a846f4ad05dcd87862514aeddfb839b8fcc225e5e" exitCode=0 Dec 09 05:18:58 crc kubenswrapper[4766]: I1209 05:18:58.670028 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" event={"ID":"ca0814c4-dca6-4dbd-841b-79d94d5dabd5","Type":"ContainerDied","Data":"d9c1d5d20ede1119ef67255a846f4ad05dcd87862514aeddfb839b8fcc225e5e"} Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.179845 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.333065 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slhsk\" (UniqueName: \"kubernetes.io/projected/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-kube-api-access-slhsk\") pod \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.333205 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ceph\") pod \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.333250 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-inventory\") pod \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.333301 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ssh-key\") pod \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\" (UID: \"ca0814c4-dca6-4dbd-841b-79d94d5dabd5\") " Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.339573 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ceph" (OuterVolumeSpecName: "ceph") pod "ca0814c4-dca6-4dbd-841b-79d94d5dabd5" (UID: "ca0814c4-dca6-4dbd-841b-79d94d5dabd5"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.339779 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-kube-api-access-slhsk" (OuterVolumeSpecName: "kube-api-access-slhsk") pod "ca0814c4-dca6-4dbd-841b-79d94d5dabd5" (UID: "ca0814c4-dca6-4dbd-841b-79d94d5dabd5"). InnerVolumeSpecName "kube-api-access-slhsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.364660 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ca0814c4-dca6-4dbd-841b-79d94d5dabd5" (UID: "ca0814c4-dca6-4dbd-841b-79d94d5dabd5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.369439 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-inventory" (OuterVolumeSpecName: "inventory") pod "ca0814c4-dca6-4dbd-841b-79d94d5dabd5" (UID: "ca0814c4-dca6-4dbd-841b-79d94d5dabd5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.436802 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slhsk\" (UniqueName: \"kubernetes.io/projected/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-kube-api-access-slhsk\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.436859 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.436880 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.436898 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ca0814c4-dca6-4dbd-841b-79d94d5dabd5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.697683 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" event={"ID":"ca0814c4-dca6-4dbd-841b-79d94d5dabd5","Type":"ContainerDied","Data":"0edb703c9600a66ad6dbbd7231dc30e995c9b89092794abe3964ba7f4d633b31"} Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.697745 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0edb703c9600a66ad6dbbd7231dc30e995c9b89092794abe3964ba7f4d633b31" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.697823 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-f2kbb" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.826059 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-gmrnw"] Dec 09 05:19:00 crc kubenswrapper[4766]: E1209 05:19:00.826657 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0814c4-dca6-4dbd-841b-79d94d5dabd5" containerName="configure-network-openstack-openstack-cell1" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.826687 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0814c4-dca6-4dbd-841b-79d94d5dabd5" containerName="configure-network-openstack-openstack-cell1" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.826923 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0814c4-dca6-4dbd-841b-79d94d5dabd5" containerName="configure-network-openstack-openstack-cell1" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.827755 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.831481 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.831608 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.832432 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.833010 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.845979 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d9gk\" (UniqueName: \"kubernetes.io/projected/80ed47d3-1b67-46bf-a503-75452343db85-kube-api-access-2d9gk\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.848160 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-inventory\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.848465 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ssh-key\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" 
(UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.848591 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ceph\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.853644 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-gmrnw"] Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.953313 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-inventory\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.953557 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ssh-key\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.953609 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ceph\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: 
I1209 05:19:00.953860 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d9gk\" (UniqueName: \"kubernetes.io/projected/80ed47d3-1b67-46bf-a503-75452343db85-kube-api-access-2d9gk\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.959040 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ceph\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.959618 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ssh-key\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.961436 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-inventory\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:00 crc kubenswrapper[4766]: I1209 05:19:00.973668 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d9gk\" (UniqueName: \"kubernetes.io/projected/80ed47d3-1b67-46bf-a503-75452343db85-kube-api-access-2d9gk\") pod \"validate-network-openstack-openstack-cell1-gmrnw\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " 
pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:01 crc kubenswrapper[4766]: I1209 05:19:01.153895 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:01 crc kubenswrapper[4766]: I1209 05:19:01.792599 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-gmrnw"] Dec 09 05:19:01 crc kubenswrapper[4766]: W1209 05:19:01.801361 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80ed47d3_1b67_46bf_a503_75452343db85.slice/crio-8bc69a1c209d8acbf7c562946d02420007591b533c9252aefdb5db88e3b025c2 WatchSource:0}: Error finding container 8bc69a1c209d8acbf7c562946d02420007591b533c9252aefdb5db88e3b025c2: Status 404 returned error can't find the container with id 8bc69a1c209d8acbf7c562946d02420007591b533c9252aefdb5db88e3b025c2 Dec 09 05:19:02 crc kubenswrapper[4766]: I1209 05:19:02.717968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" event={"ID":"80ed47d3-1b67-46bf-a503-75452343db85","Type":"ContainerStarted","Data":"c7a1f70ac8ef6567a0095521762ade5dc52f1701e4cf8800691f659c384c8369"} Dec 09 05:19:02 crc kubenswrapper[4766]: I1209 05:19:02.718359 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" event={"ID":"80ed47d3-1b67-46bf-a503-75452343db85","Type":"ContainerStarted","Data":"8bc69a1c209d8acbf7c562946d02420007591b533c9252aefdb5db88e3b025c2"} Dec 09 05:19:02 crc kubenswrapper[4766]: I1209 05:19:02.750720 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" podStartSLOduration=2.562796584 podStartE2EDuration="2.750701862s" podCreationTimestamp="2025-12-09 05:19:00 +0000 UTC" 
firstStartedPulling="2025-12-09 05:19:01.81000284 +0000 UTC m=+7623.519308266" lastFinishedPulling="2025-12-09 05:19:01.997908128 +0000 UTC m=+7623.707213544" observedRunningTime="2025-12-09 05:19:02.742185441 +0000 UTC m=+7624.451490887" watchObservedRunningTime="2025-12-09 05:19:02.750701862 +0000 UTC m=+7624.460007288" Dec 09 05:19:07 crc kubenswrapper[4766]: I1209 05:19:07.788020 4766 generic.go:334] "Generic (PLEG): container finished" podID="80ed47d3-1b67-46bf-a503-75452343db85" containerID="c7a1f70ac8ef6567a0095521762ade5dc52f1701e4cf8800691f659c384c8369" exitCode=0 Dec 09 05:19:07 crc kubenswrapper[4766]: I1209 05:19:07.788659 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" event={"ID":"80ed47d3-1b67-46bf-a503-75452343db85","Type":"ContainerDied","Data":"c7a1f70ac8ef6567a0095521762ade5dc52f1701e4cf8800691f659c384c8369"} Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.305608 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.441545 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ssh-key\") pod \"80ed47d3-1b67-46bf-a503-75452343db85\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.441681 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ceph\") pod \"80ed47d3-1b67-46bf-a503-75452343db85\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.441729 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d9gk\" (UniqueName: \"kubernetes.io/projected/80ed47d3-1b67-46bf-a503-75452343db85-kube-api-access-2d9gk\") pod \"80ed47d3-1b67-46bf-a503-75452343db85\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.441949 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-inventory\") pod \"80ed47d3-1b67-46bf-a503-75452343db85\" (UID: \"80ed47d3-1b67-46bf-a503-75452343db85\") " Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.447931 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ed47d3-1b67-46bf-a503-75452343db85-kube-api-access-2d9gk" (OuterVolumeSpecName: "kube-api-access-2d9gk") pod "80ed47d3-1b67-46bf-a503-75452343db85" (UID: "80ed47d3-1b67-46bf-a503-75452343db85"). InnerVolumeSpecName "kube-api-access-2d9gk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.450890 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ceph" (OuterVolumeSpecName: "ceph") pod "80ed47d3-1b67-46bf-a503-75452343db85" (UID: "80ed47d3-1b67-46bf-a503-75452343db85"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.479113 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-inventory" (OuterVolumeSpecName: "inventory") pod "80ed47d3-1b67-46bf-a503-75452343db85" (UID: "80ed47d3-1b67-46bf-a503-75452343db85"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.486629 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "80ed47d3-1b67-46bf-a503-75452343db85" (UID: "80ed47d3-1b67-46bf-a503-75452343db85"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.545736 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.545779 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.545791 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/80ed47d3-1b67-46bf-a503-75452343db85-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.545803 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d9gk\" (UniqueName: \"kubernetes.io/projected/80ed47d3-1b67-46bf-a503-75452343db85-kube-api-access-2d9gk\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.832501 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" event={"ID":"80ed47d3-1b67-46bf-a503-75452343db85","Type":"ContainerDied","Data":"8bc69a1c209d8acbf7c562946d02420007591b533c9252aefdb5db88e3b025c2"} Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.832553 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bc69a1c209d8acbf7c562946d02420007591b533c9252aefdb5db88e3b025c2" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.832639 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-gmrnw" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.900388 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-zr46j"] Dec 09 05:19:09 crc kubenswrapper[4766]: E1209 05:19:09.901074 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ed47d3-1b67-46bf-a503-75452343db85" containerName="validate-network-openstack-openstack-cell1" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.901108 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ed47d3-1b67-46bf-a503-75452343db85" containerName="validate-network-openstack-openstack-cell1" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.901432 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ed47d3-1b67-46bf-a503-75452343db85" containerName="validate-network-openstack-openstack-cell1" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.902533 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.905954 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.906322 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.908079 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.908266 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:19:09 crc kubenswrapper[4766]: I1209 05:19:09.914608 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-zr46j"] Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.057417 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-inventory\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.057884 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ceph\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.058105 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ssh-key\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.058149 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp9rf\" (UniqueName: \"kubernetes.io/projected/65aff17e-b3a1-487f-98fc-c6a81aaf1029-kube-api-access-dp9rf\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.161031 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ssh-key\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.161118 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp9rf\" (UniqueName: \"kubernetes.io/projected/65aff17e-b3a1-487f-98fc-c6a81aaf1029-kube-api-access-dp9rf\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.161299 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-inventory\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.161405 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ceph\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.166867 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-inventory\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.167735 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ceph\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.171771 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ssh-key\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.179920 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp9rf\" (UniqueName: \"kubernetes.io/projected/65aff17e-b3a1-487f-98fc-c6a81aaf1029-kube-api-access-dp9rf\") pod \"install-os-openstack-openstack-cell1-zr46j\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.230943 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:10 crc kubenswrapper[4766]: I1209 05:19:10.906942 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-zr46j"] Dec 09 05:19:11 crc kubenswrapper[4766]: I1209 05:19:11.889658 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-zr46j" event={"ID":"65aff17e-b3a1-487f-98fc-c6a81aaf1029","Type":"ContainerStarted","Data":"5cfcec15ccfa03b4e56e29b0758c30d1e7db19d5a63ac5fdf12670b943f04111"} Dec 09 05:19:11 crc kubenswrapper[4766]: I1209 05:19:11.890303 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-zr46j" event={"ID":"65aff17e-b3a1-487f-98fc-c6a81aaf1029","Type":"ContainerStarted","Data":"1d46b616e29df43e92f319835bfab5e8e6216b24a95f039b0275328b8772a8f7"} Dec 09 05:19:11 crc kubenswrapper[4766]: I1209 05:19:11.918386 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-zr46j" podStartSLOduration=2.719492143 podStartE2EDuration="2.918369388s" podCreationTimestamp="2025-12-09 05:19:09 +0000 UTC" firstStartedPulling="2025-12-09 05:19:10.920522588 +0000 UTC m=+7632.629828014" lastFinishedPulling="2025-12-09 05:19:11.119399823 +0000 UTC m=+7632.828705259" observedRunningTime="2025-12-09 05:19:11.909318703 +0000 UTC m=+7633.618624129" watchObservedRunningTime="2025-12-09 05:19:11.918369388 +0000 UTC m=+7633.627674804" Dec 09 05:19:57 crc kubenswrapper[4766]: I1209 05:19:57.390207 4766 generic.go:334] "Generic (PLEG): container finished" podID="65aff17e-b3a1-487f-98fc-c6a81aaf1029" containerID="5cfcec15ccfa03b4e56e29b0758c30d1e7db19d5a63ac5fdf12670b943f04111" exitCode=0 Dec 09 05:19:57 crc kubenswrapper[4766]: I1209 05:19:57.390828 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-openstack-openstack-cell1-zr46j" event={"ID":"65aff17e-b3a1-487f-98fc-c6a81aaf1029","Type":"ContainerDied","Data":"5cfcec15ccfa03b4e56e29b0758c30d1e7db19d5a63ac5fdf12670b943f04111"} Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.610505 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sxgzr"] Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.614838 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.621315 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxgzr"] Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.762786 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-utilities\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.763395 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-catalog-content\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.763447 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7n6\" (UniqueName: \"kubernetes.io/projected/78716cbd-4c08-44c2-941c-f1d1019d5146-kube-api-access-jn7n6\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 
05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.871160 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-catalog-content\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.871255 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7n6\" (UniqueName: \"kubernetes.io/projected/78716cbd-4c08-44c2-941c-f1d1019d5146-kube-api-access-jn7n6\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.871371 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-utilities\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.872010 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-utilities\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.872290 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-catalog-content\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 
05:19:58.895614 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7n6\" (UniqueName: \"kubernetes.io/projected/78716cbd-4c08-44c2-941c-f1d1019d5146-kube-api-access-jn7n6\") pod \"community-operators-sxgzr\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.951191 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:19:58 crc kubenswrapper[4766]: I1209 05:19:58.963315 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.076249 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ceph\") pod \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.076514 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ssh-key\") pod \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.076576 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-inventory\") pod \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.076621 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp9rf\" (UniqueName: 
\"kubernetes.io/projected/65aff17e-b3a1-487f-98fc-c6a81aaf1029-kube-api-access-dp9rf\") pod \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\" (UID: \"65aff17e-b3a1-487f-98fc-c6a81aaf1029\") " Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.081389 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ceph" (OuterVolumeSpecName: "ceph") pod "65aff17e-b3a1-487f-98fc-c6a81aaf1029" (UID: "65aff17e-b3a1-487f-98fc-c6a81aaf1029"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.081547 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65aff17e-b3a1-487f-98fc-c6a81aaf1029-kube-api-access-dp9rf" (OuterVolumeSpecName: "kube-api-access-dp9rf") pod "65aff17e-b3a1-487f-98fc-c6a81aaf1029" (UID: "65aff17e-b3a1-487f-98fc-c6a81aaf1029"). InnerVolumeSpecName "kube-api-access-dp9rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.110548 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65aff17e-b3a1-487f-98fc-c6a81aaf1029" (UID: "65aff17e-b3a1-487f-98fc-c6a81aaf1029"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.120004 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-inventory" (OuterVolumeSpecName: "inventory") pod "65aff17e-b3a1-487f-98fc-c6a81aaf1029" (UID: "65aff17e-b3a1-487f-98fc-c6a81aaf1029"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.179529 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.179557 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.179568 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65aff17e-b3a1-487f-98fc-c6a81aaf1029-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.179578 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp9rf\" (UniqueName: \"kubernetes.io/projected/65aff17e-b3a1-487f-98fc-c6a81aaf1029-kube-api-access-dp9rf\") on node \"crc\" DevicePath \"\"" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.413264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-zr46j" event={"ID":"65aff17e-b3a1-487f-98fc-c6a81aaf1029","Type":"ContainerDied","Data":"1d46b616e29df43e92f319835bfab5e8e6216b24a95f039b0275328b8772a8f7"} Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.413320 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d46b616e29df43e92f319835bfab5e8e6216b24a95f039b0275328b8772a8f7" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.413340 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-zr46j" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.452299 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxgzr"] Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.559011 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-gvx4s"] Dec 09 05:19:59 crc kubenswrapper[4766]: E1209 05:19:59.559962 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65aff17e-b3a1-487f-98fc-c6a81aaf1029" containerName="install-os-openstack-openstack-cell1" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.560040 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="65aff17e-b3a1-487f-98fc-c6a81aaf1029" containerName="install-os-openstack-openstack-cell1" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.560409 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="65aff17e-b3a1-487f-98fc-c6a81aaf1029" containerName="install-os-openstack-openstack-cell1" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.561235 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.564860 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.564950 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.565707 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.567484 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.596845 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-gvx4s"] Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.691538 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-inventory\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.691597 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww277\" (UniqueName: \"kubernetes.io/projected/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-kube-api-access-ww277\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.691830 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ceph\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.691933 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.794434 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-inventory\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.794490 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww277\" (UniqueName: \"kubernetes.io/projected/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-kube-api-access-ww277\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.794538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ceph\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.794569 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.799056 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-inventory\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.799320 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.799506 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ceph\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 05:19:59.817159 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww277\" (UniqueName: \"kubernetes.io/projected/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-kube-api-access-ww277\") pod \"configure-os-openstack-openstack-cell1-gvx4s\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:19:59 crc kubenswrapper[4766]: I1209 
05:19:59.897731 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:20:00 crc kubenswrapper[4766]: I1209 05:20:00.426732 4766 generic.go:334] "Generic (PLEG): container finished" podID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerID="2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b" exitCode=0 Dec 09 05:20:00 crc kubenswrapper[4766]: I1209 05:20:00.427030 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxgzr" event={"ID":"78716cbd-4c08-44c2-941c-f1d1019d5146","Type":"ContainerDied","Data":"2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b"} Dec 09 05:20:00 crc kubenswrapper[4766]: I1209 05:20:00.427068 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxgzr" event={"ID":"78716cbd-4c08-44c2-941c-f1d1019d5146","Type":"ContainerStarted","Data":"7622f024433a059fd4df73a0fb618956352a04e2dd51c90859e6919d9acbed3f"} Dec 09 05:20:00 crc kubenswrapper[4766]: I1209 05:20:00.534328 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-gvx4s"] Dec 09 05:20:00 crc kubenswrapper[4766]: W1209 05:20:00.538827 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea6ef63b_4223_4237_9ba9_d389f0b11cc5.slice/crio-e3dc7ceac4cb2eb08384b9cc6f75620a1f8e6a9d34e898ddfcac3131b218b8a2 WatchSource:0}: Error finding container e3dc7ceac4cb2eb08384b9cc6f75620a1f8e6a9d34e898ddfcac3131b218b8a2: Status 404 returned error can't find the container with id e3dc7ceac4cb2eb08384b9cc6f75620a1f8e6a9d34e898ddfcac3131b218b8a2 Dec 09 05:20:01 crc kubenswrapper[4766]: I1209 05:20:01.460725 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" 
event={"ID":"ea6ef63b-4223-4237-9ba9-d389f0b11cc5","Type":"ContainerStarted","Data":"4ec5075c035443c1682248516aed59cb5fcad432b82d9292c18e7814279eccc4"} Dec 09 05:20:01 crc kubenswrapper[4766]: I1209 05:20:01.462453 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" event={"ID":"ea6ef63b-4223-4237-9ba9-d389f0b11cc5","Type":"ContainerStarted","Data":"e3dc7ceac4cb2eb08384b9cc6f75620a1f8e6a9d34e898ddfcac3131b218b8a2"} Dec 09 05:20:01 crc kubenswrapper[4766]: I1209 05:20:01.497395 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" podStartSLOduration=2.289708397 podStartE2EDuration="2.49736604s" podCreationTimestamp="2025-12-09 05:19:59 +0000 UTC" firstStartedPulling="2025-12-09 05:20:00.541902177 +0000 UTC m=+7682.251207603" lastFinishedPulling="2025-12-09 05:20:00.74955979 +0000 UTC m=+7682.458865246" observedRunningTime="2025-12-09 05:20:01.487649987 +0000 UTC m=+7683.196955423" watchObservedRunningTime="2025-12-09 05:20:01.49736604 +0000 UTC m=+7683.206671506" Dec 09 05:20:02 crc kubenswrapper[4766]: I1209 05:20:02.484090 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxgzr" event={"ID":"78716cbd-4c08-44c2-941c-f1d1019d5146","Type":"ContainerStarted","Data":"037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b"} Dec 09 05:20:03 crc kubenswrapper[4766]: I1209 05:20:03.492842 4766 generic.go:334] "Generic (PLEG): container finished" podID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerID="037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b" exitCode=0 Dec 09 05:20:03 crc kubenswrapper[4766]: I1209 05:20:03.492885 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxgzr" 
event={"ID":"78716cbd-4c08-44c2-941c-f1d1019d5146","Type":"ContainerDied","Data":"037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b"} Dec 09 05:20:04 crc kubenswrapper[4766]: I1209 05:20:04.511673 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxgzr" event={"ID":"78716cbd-4c08-44c2-941c-f1d1019d5146","Type":"ContainerStarted","Data":"16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836"} Dec 09 05:20:04 crc kubenswrapper[4766]: I1209 05:20:04.538505 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sxgzr" podStartSLOduration=3.08340338 podStartE2EDuration="6.538474818s" podCreationTimestamp="2025-12-09 05:19:58 +0000 UTC" firstStartedPulling="2025-12-09 05:20:00.429027351 +0000 UTC m=+7682.138332797" lastFinishedPulling="2025-12-09 05:20:03.884098759 +0000 UTC m=+7685.593404235" observedRunningTime="2025-12-09 05:20:04.530377679 +0000 UTC m=+7686.239683145" watchObservedRunningTime="2025-12-09 05:20:04.538474818 +0000 UTC m=+7686.247780294" Dec 09 05:20:07 crc kubenswrapper[4766]: I1209 05:20:07.316034 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:20:07 crc kubenswrapper[4766]: I1209 05:20:07.316522 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:20:08 crc kubenswrapper[4766]: I1209 05:20:08.951971 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:20:08 crc kubenswrapper[4766]: I1209 05:20:08.953041 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:20:09 crc kubenswrapper[4766]: I1209 05:20:09.013936 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:20:09 crc kubenswrapper[4766]: I1209 05:20:09.668348 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:20:09 crc kubenswrapper[4766]: I1209 05:20:09.748776 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxgzr"] Dec 09 05:20:11 crc kubenswrapper[4766]: I1209 05:20:11.591534 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sxgzr" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerName="registry-server" containerID="cri-o://16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836" gracePeriod=2 Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.165452 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.294160 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-utilities\") pod \"78716cbd-4c08-44c2-941c-f1d1019d5146\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.294311 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-catalog-content\") pod \"78716cbd-4c08-44c2-941c-f1d1019d5146\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.294536 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn7n6\" (UniqueName: \"kubernetes.io/projected/78716cbd-4c08-44c2-941c-f1d1019d5146-kube-api-access-jn7n6\") pod \"78716cbd-4c08-44c2-941c-f1d1019d5146\" (UID: \"78716cbd-4c08-44c2-941c-f1d1019d5146\") " Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.295302 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-utilities" (OuterVolumeSpecName: "utilities") pod "78716cbd-4c08-44c2-941c-f1d1019d5146" (UID: "78716cbd-4c08-44c2-941c-f1d1019d5146"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.303002 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78716cbd-4c08-44c2-941c-f1d1019d5146-kube-api-access-jn7n6" (OuterVolumeSpecName: "kube-api-access-jn7n6") pod "78716cbd-4c08-44c2-941c-f1d1019d5146" (UID: "78716cbd-4c08-44c2-941c-f1d1019d5146"). InnerVolumeSpecName "kube-api-access-jn7n6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.346779 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78716cbd-4c08-44c2-941c-f1d1019d5146" (UID: "78716cbd-4c08-44c2-941c-f1d1019d5146"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.397385 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn7n6\" (UniqueName: \"kubernetes.io/projected/78716cbd-4c08-44c2-941c-f1d1019d5146-kube-api-access-jn7n6\") on node \"crc\" DevicePath \"\"" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.397434 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.397446 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78716cbd-4c08-44c2-941c-f1d1019d5146-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.605786 4766 generic.go:334] "Generic (PLEG): container finished" podID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerID="16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836" exitCode=0 Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.605857 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxgzr" event={"ID":"78716cbd-4c08-44c2-941c-f1d1019d5146","Type":"ContainerDied","Data":"16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836"} Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.605914 4766 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-sxgzr" event={"ID":"78716cbd-4c08-44c2-941c-f1d1019d5146","Type":"ContainerDied","Data":"7622f024433a059fd4df73a0fb618956352a04e2dd51c90859e6919d9acbed3f"} Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.605951 4766 scope.go:117] "RemoveContainer" containerID="16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.609405 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxgzr" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.640760 4766 scope.go:117] "RemoveContainer" containerID="037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.655421 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxgzr"] Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.666965 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sxgzr"] Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.681157 4766 scope.go:117] "RemoveContainer" containerID="2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.752895 4766 scope.go:117] "RemoveContainer" containerID="16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836" Dec 09 05:20:12 crc kubenswrapper[4766]: E1209 05:20:12.753474 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836\": container with ID starting with 16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836 not found: ID does not exist" containerID="16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 
05:20:12.753523 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836"} err="failed to get container status \"16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836\": rpc error: code = NotFound desc = could not find container \"16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836\": container with ID starting with 16f94fd48620ff91eca708fe8d7a83c410af939c6b70e85c8ba04da94cd9f836 not found: ID does not exist" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.753552 4766 scope.go:117] "RemoveContainer" containerID="037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b" Dec 09 05:20:12 crc kubenswrapper[4766]: E1209 05:20:12.753993 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b\": container with ID starting with 037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b not found: ID does not exist" containerID="037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.754026 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b"} err="failed to get container status \"037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b\": rpc error: code = NotFound desc = could not find container \"037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b\": container with ID starting with 037bb1885312a25bccb24e91d613d798381aa16046a5cd65110bb07fbd89094b not found: ID does not exist" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.754053 4766 scope.go:117] "RemoveContainer" containerID="2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b" Dec 09 05:20:12 crc 
kubenswrapper[4766]: E1209 05:20:12.754383 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b\": container with ID starting with 2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b not found: ID does not exist" containerID="2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.754413 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b"} err="failed to get container status \"2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b\": rpc error: code = NotFound desc = could not find container \"2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b\": container with ID starting with 2233cd3e8b07f67f0699758df925729b7596e6bf44402361633d971be5be162b not found: ID does not exist" Dec 09 05:20:12 crc kubenswrapper[4766]: I1209 05:20:12.861898 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" path="/var/lib/kubelet/pods/78716cbd-4c08-44c2-941c-f1d1019d5146/volumes" Dec 09 05:20:37 crc kubenswrapper[4766]: I1209 05:20:37.316444 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:20:37 crc kubenswrapper[4766]: I1209 05:20:37.317453 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 09 05:20:47 crc kubenswrapper[4766]: I1209 05:20:47.041136 4766 generic.go:334] "Generic (PLEG): container finished" podID="ea6ef63b-4223-4237-9ba9-d389f0b11cc5" containerID="4ec5075c035443c1682248516aed59cb5fcad432b82d9292c18e7814279eccc4" exitCode=0 Dec 09 05:20:47 crc kubenswrapper[4766]: I1209 05:20:47.041318 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" event={"ID":"ea6ef63b-4223-4237-9ba9-d389f0b11cc5","Type":"ContainerDied","Data":"4ec5075c035443c1682248516aed59cb5fcad432b82d9292c18e7814279eccc4"} Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.482408 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.669883 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-inventory\") pod \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.669982 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ssh-key\") pod \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.670115 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww277\" (UniqueName: \"kubernetes.io/projected/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-kube-api-access-ww277\") pod \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.670231 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ceph\") pod \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\" (UID: \"ea6ef63b-4223-4237-9ba9-d389f0b11cc5\") " Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.676784 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ceph" (OuterVolumeSpecName: "ceph") pod "ea6ef63b-4223-4237-9ba9-d389f0b11cc5" (UID: "ea6ef63b-4223-4237-9ba9-d389f0b11cc5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.677253 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-kube-api-access-ww277" (OuterVolumeSpecName: "kube-api-access-ww277") pod "ea6ef63b-4223-4237-9ba9-d389f0b11cc5" (UID: "ea6ef63b-4223-4237-9ba9-d389f0b11cc5"). InnerVolumeSpecName "kube-api-access-ww277". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.702264 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea6ef63b-4223-4237-9ba9-d389f0b11cc5" (UID: "ea6ef63b-4223-4237-9ba9-d389f0b11cc5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.707586 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-inventory" (OuterVolumeSpecName: "inventory") pod "ea6ef63b-4223-4237-9ba9-d389f0b11cc5" (UID: "ea6ef63b-4223-4237-9ba9-d389f0b11cc5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.772698 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.772737 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.772749 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww277\" (UniqueName: \"kubernetes.io/projected/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-kube-api-access-ww277\") on node \"crc\" DevicePath \"\"" Dec 09 05:20:48 crc kubenswrapper[4766]: I1209 05:20:48.772758 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea6ef63b-4223-4237-9ba9-d389f0b11cc5-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.062400 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" event={"ID":"ea6ef63b-4223-4237-9ba9-d389f0b11cc5","Type":"ContainerDied","Data":"e3dc7ceac4cb2eb08384b9cc6f75620a1f8e6a9d34e898ddfcac3131b218b8a2"} Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.062844 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3dc7ceac4cb2eb08384b9cc6f75620a1f8e6a9d34e898ddfcac3131b218b8a2" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.062480 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-gvx4s" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.201535 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-vpkgx"] Dec 09 05:20:49 crc kubenswrapper[4766]: E1209 05:20:49.201953 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6ef63b-4223-4237-9ba9-d389f0b11cc5" containerName="configure-os-openstack-openstack-cell1" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.201970 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6ef63b-4223-4237-9ba9-d389f0b11cc5" containerName="configure-os-openstack-openstack-cell1" Dec 09 05:20:49 crc kubenswrapper[4766]: E1209 05:20:49.202007 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerName="extract-utilities" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.202017 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerName="extract-utilities" Dec 09 05:20:49 crc kubenswrapper[4766]: E1209 05:20:49.202031 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerName="registry-server" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.202038 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerName="registry-server" Dec 09 05:20:49 crc kubenswrapper[4766]: E1209 05:20:49.202053 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerName="extract-content" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.202060 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerName="extract-content" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.202261 4766 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="ea6ef63b-4223-4237-9ba9-d389f0b11cc5" containerName="configure-os-openstack-openstack-cell1" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.202302 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="78716cbd-4c08-44c2-941c-f1d1019d5146" containerName="registry-server" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.203056 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.205458 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.205496 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.206513 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.212640 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.213386 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-vpkgx"] Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.387325 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv5bq\" (UniqueName: \"kubernetes.io/projected/322ba64a-2659-4d62-b831-219df10e7c72-kube-api-access-cv5bq\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.387388 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-inventory-0\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.387525 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.387635 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ceph\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.490058 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ceph\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.490286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv5bq\" (UniqueName: \"kubernetes.io/projected/322ba64a-2659-4d62-b831-219df10e7c72-kube-api-access-cv5bq\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.490375 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-inventory-0\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.490610 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.495557 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-inventory-0\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.495676 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ceph\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.499752 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.511805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv5bq\" (UniqueName: 
\"kubernetes.io/projected/322ba64a-2659-4d62-b831-219df10e7c72-kube-api-access-cv5bq\") pod \"ssh-known-hosts-openstack-vpkgx\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:49 crc kubenswrapper[4766]: I1209 05:20:49.520980 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:20:50 crc kubenswrapper[4766]: I1209 05:20:50.098202 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-vpkgx"] Dec 09 05:20:50 crc kubenswrapper[4766]: W1209 05:20:50.110180 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod322ba64a_2659_4d62_b831_219df10e7c72.slice/crio-046de571f011b833f02bc0b5cf1d962f78ec57e123c85c6e98b77289157bf4c3 WatchSource:0}: Error finding container 046de571f011b833f02bc0b5cf1d962f78ec57e123c85c6e98b77289157bf4c3: Status 404 returned error can't find the container with id 046de571f011b833f02bc0b5cf1d962f78ec57e123c85c6e98b77289157bf4c3 Dec 09 05:20:51 crc kubenswrapper[4766]: I1209 05:20:51.085491 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-vpkgx" event={"ID":"322ba64a-2659-4d62-b831-219df10e7c72","Type":"ContainerStarted","Data":"098e8ded7a9e8e7227ac0c57cefe224a8f30ee6b160fa01fd8679a10f536e4d8"} Dec 09 05:20:51 crc kubenswrapper[4766]: I1209 05:20:51.085853 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-vpkgx" event={"ID":"322ba64a-2659-4d62-b831-219df10e7c72","Type":"ContainerStarted","Data":"046de571f011b833f02bc0b5cf1d962f78ec57e123c85c6e98b77289157bf4c3"} Dec 09 05:20:51 crc kubenswrapper[4766]: I1209 05:20:51.110865 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-vpkgx" podStartSLOduration=1.9625488500000001 
podStartE2EDuration="2.110818885s" podCreationTimestamp="2025-12-09 05:20:49 +0000 UTC" firstStartedPulling="2025-12-09 05:20:50.11390718 +0000 UTC m=+7731.823212606" lastFinishedPulling="2025-12-09 05:20:50.262177215 +0000 UTC m=+7731.971482641" observedRunningTime="2025-12-09 05:20:51.105079789 +0000 UTC m=+7732.814385205" watchObservedRunningTime="2025-12-09 05:20:51.110818885 +0000 UTC m=+7732.820124321" Dec 09 05:21:00 crc kubenswrapper[4766]: I1209 05:21:00.231546 4766 generic.go:334] "Generic (PLEG): container finished" podID="322ba64a-2659-4d62-b831-219df10e7c72" containerID="098e8ded7a9e8e7227ac0c57cefe224a8f30ee6b160fa01fd8679a10f536e4d8" exitCode=0 Dec 09 05:21:00 crc kubenswrapper[4766]: I1209 05:21:00.231664 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-vpkgx" event={"ID":"322ba64a-2659-4d62-b831-219df10e7c72","Type":"ContainerDied","Data":"098e8ded7a9e8e7227ac0c57cefe224a8f30ee6b160fa01fd8679a10f536e4d8"} Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.851674 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.961268 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv5bq\" (UniqueName: \"kubernetes.io/projected/322ba64a-2659-4d62-b831-219df10e7c72-kube-api-access-cv5bq\") pod \"322ba64a-2659-4d62-b831-219df10e7c72\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.961502 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-inventory-0\") pod \"322ba64a-2659-4d62-b831-219df10e7c72\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.961577 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ssh-key-openstack-cell1\") pod \"322ba64a-2659-4d62-b831-219df10e7c72\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.961634 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ceph\") pod \"322ba64a-2659-4d62-b831-219df10e7c72\" (UID: \"322ba64a-2659-4d62-b831-219df10e7c72\") " Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.967587 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322ba64a-2659-4d62-b831-219df10e7c72-kube-api-access-cv5bq" (OuterVolumeSpecName: "kube-api-access-cv5bq") pod "322ba64a-2659-4d62-b831-219df10e7c72" (UID: "322ba64a-2659-4d62-b831-219df10e7c72"). InnerVolumeSpecName "kube-api-access-cv5bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.967704 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ceph" (OuterVolumeSpecName: "ceph") pod "322ba64a-2659-4d62-b831-219df10e7c72" (UID: "322ba64a-2659-4d62-b831-219df10e7c72"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.998482 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "322ba64a-2659-4d62-b831-219df10e7c72" (UID: "322ba64a-2659-4d62-b831-219df10e7c72"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:01 crc kubenswrapper[4766]: I1209 05:21:01.999236 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "322ba64a-2659-4d62-b831-219df10e7c72" (UID: "322ba64a-2659-4d62-b831-219df10e7c72"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.064332 4766 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.064381 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.064400 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/322ba64a-2659-4d62-b831-219df10e7c72-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.064420 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv5bq\" (UniqueName: \"kubernetes.io/projected/322ba64a-2659-4d62-b831-219df10e7c72-kube-api-access-cv5bq\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.263688 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-vpkgx" event={"ID":"322ba64a-2659-4d62-b831-219df10e7c72","Type":"ContainerDied","Data":"046de571f011b833f02bc0b5cf1d962f78ec57e123c85c6e98b77289157bf4c3"} Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.263742 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="046de571f011b833f02bc0b5cf1d962f78ec57e123c85c6e98b77289157bf4c3" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.263758 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-vpkgx" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.342751 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-sksdt"] Dec 09 05:21:02 crc kubenswrapper[4766]: E1209 05:21:02.343467 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322ba64a-2659-4d62-b831-219df10e7c72" containerName="ssh-known-hosts-openstack" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.343495 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="322ba64a-2659-4d62-b831-219df10e7c72" containerName="ssh-known-hosts-openstack" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.343886 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="322ba64a-2659-4d62-b831-219df10e7c72" containerName="ssh-known-hosts-openstack" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.345233 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.347922 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.347969 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.347926 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.348457 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.363012 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-sksdt"] Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 
05:21:02.473120 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ceph\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.473242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ssh-key\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.473290 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnb6c\" (UniqueName: \"kubernetes.io/projected/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-kube-api-access-tnb6c\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.473400 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-inventory\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.575582 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ceph\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " 
pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.575887 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ssh-key\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.575921 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnb6c\" (UniqueName: \"kubernetes.io/projected/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-kube-api-access-tnb6c\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.575975 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-inventory\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.580368 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-inventory\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.582474 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ssh-key\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " 
pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.583331 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ceph\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.594540 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnb6c\" (UniqueName: \"kubernetes.io/projected/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-kube-api-access-tnb6c\") pod \"run-os-openstack-openstack-cell1-sksdt\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:02 crc kubenswrapper[4766]: I1209 05:21:02.676075 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:03 crc kubenswrapper[4766]: I1209 05:21:03.357882 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-sksdt"] Dec 09 05:21:04 crc kubenswrapper[4766]: I1209 05:21:04.285660 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-sksdt" event={"ID":"fa7129ae-c7da-4b11-ade0-9d16ae5373d5","Type":"ContainerStarted","Data":"1e0e66dafb2ca10674b5acd9bbd2a00dec35758cce4d40d9014dc5ebffa9d56a"} Dec 09 05:21:04 crc kubenswrapper[4766]: I1209 05:21:04.286177 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-sksdt" event={"ID":"fa7129ae-c7da-4b11-ade0-9d16ae5373d5","Type":"ContainerStarted","Data":"0a928ababdc035721f3c1e99212d640676c7a0e5e32ade6a867466b0b7567e1b"} Dec 09 05:21:04 crc kubenswrapper[4766]: I1209 05:21:04.308525 4766 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/run-os-openstack-openstack-cell1-sksdt" podStartSLOduration=2.121196865 podStartE2EDuration="2.308494817s" podCreationTimestamp="2025-12-09 05:21:02 +0000 UTC" firstStartedPulling="2025-12-09 05:21:03.360229889 +0000 UTC m=+7745.069535315" lastFinishedPulling="2025-12-09 05:21:03.547527831 +0000 UTC m=+7745.256833267" observedRunningTime="2025-12-09 05:21:04.304255662 +0000 UTC m=+7746.013561098" watchObservedRunningTime="2025-12-09 05:21:04.308494817 +0000 UTC m=+7746.017800243" Dec 09 05:21:07 crc kubenswrapper[4766]: I1209 05:21:07.317246 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:21:07 crc kubenswrapper[4766]: I1209 05:21:07.317840 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:21:07 crc kubenswrapper[4766]: I1209 05:21:07.317902 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:21:07 crc kubenswrapper[4766]: I1209 05:21:07.318844 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:21:07 crc kubenswrapper[4766]: I1209 05:21:07.318935 4766 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" gracePeriod=600 Dec 09 05:21:07 crc kubenswrapper[4766]: E1209 05:21:07.443341 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:21:08 crc kubenswrapper[4766]: I1209 05:21:08.335841 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" exitCode=0 Dec 09 05:21:08 crc kubenswrapper[4766]: I1209 05:21:08.335900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562"} Dec 09 05:21:08 crc kubenswrapper[4766]: I1209 05:21:08.335953 4766 scope.go:117] "RemoveContainer" containerID="ae849d50cccd9d90d9a24c69edbf389e2f39e47b7bce3e5ed5b3695b8a3514ef" Dec 09 05:21:08 crc kubenswrapper[4766]: I1209 05:21:08.336639 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:21:08 crc kubenswrapper[4766]: E1209 05:21:08.337070 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:21:12 crc kubenswrapper[4766]: I1209 05:21:12.376872 4766 generic.go:334] "Generic (PLEG): container finished" podID="fa7129ae-c7da-4b11-ade0-9d16ae5373d5" containerID="1e0e66dafb2ca10674b5acd9bbd2a00dec35758cce4d40d9014dc5ebffa9d56a" exitCode=0 Dec 09 05:21:12 crc kubenswrapper[4766]: I1209 05:21:12.377022 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-sksdt" event={"ID":"fa7129ae-c7da-4b11-ade0-9d16ae5373d5","Type":"ContainerDied","Data":"1e0e66dafb2ca10674b5acd9bbd2a00dec35758cce4d40d9014dc5ebffa9d56a"} Dec 09 05:21:13 crc kubenswrapper[4766]: I1209 05:21:13.856814 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.002821 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ssh-key\") pod \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.003022 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnb6c\" (UniqueName: \"kubernetes.io/projected/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-kube-api-access-tnb6c\") pod \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.003193 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-inventory\") pod 
\"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.003263 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ceph\") pod \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\" (UID: \"fa7129ae-c7da-4b11-ade0-9d16ae5373d5\") " Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.020287 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ceph" (OuterVolumeSpecName: "ceph") pod "fa7129ae-c7da-4b11-ade0-9d16ae5373d5" (UID: "fa7129ae-c7da-4b11-ade0-9d16ae5373d5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.030787 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-kube-api-access-tnb6c" (OuterVolumeSpecName: "kube-api-access-tnb6c") pod "fa7129ae-c7da-4b11-ade0-9d16ae5373d5" (UID: "fa7129ae-c7da-4b11-ade0-9d16ae5373d5"). InnerVolumeSpecName "kube-api-access-tnb6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.034798 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fa7129ae-c7da-4b11-ade0-9d16ae5373d5" (UID: "fa7129ae-c7da-4b11-ade0-9d16ae5373d5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.069514 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-inventory" (OuterVolumeSpecName: "inventory") pod "fa7129ae-c7da-4b11-ade0-9d16ae5373d5" (UID: "fa7129ae-c7da-4b11-ade0-9d16ae5373d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.105861 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.105896 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.105905 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.105915 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnb6c\" (UniqueName: \"kubernetes.io/projected/fa7129ae-c7da-4b11-ade0-9d16ae5373d5-kube-api-access-tnb6c\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.396985 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-sksdt" event={"ID":"fa7129ae-c7da-4b11-ade0-9d16ae5373d5","Type":"ContainerDied","Data":"0a928ababdc035721f3c1e99212d640676c7a0e5e32ade6a867466b0b7567e1b"} Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.397027 4766 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0a928ababdc035721f3c1e99212d640676c7a0e5e32ade6a867466b0b7567e1b" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.397050 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-sksdt" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.482885 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-fzkl6"] Dec 09 05:21:14 crc kubenswrapper[4766]: E1209 05:21:14.483536 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7129ae-c7da-4b11-ade0-9d16ae5373d5" containerName="run-os-openstack-openstack-cell1" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.483563 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7129ae-c7da-4b11-ade0-9d16ae5373d5" containerName="run-os-openstack-openstack-cell1" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.483846 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7129ae-c7da-4b11-ade0-9d16ae5373d5" containerName="run-os-openstack-openstack-cell1" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.484686 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.487030 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.487228 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.487358 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.489281 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.495369 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-fzkl6"] Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.520715 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-inventory\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.520818 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.520910 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8hp2\" (UniqueName: 
\"kubernetes.io/projected/ce33c619-39af-44a4-9643-a60c928971b5-kube-api-access-z8hp2\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.521160 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ceph\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.622714 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8hp2\" (UniqueName: \"kubernetes.io/projected/ce33c619-39af-44a4-9643-a60c928971b5-kube-api-access-z8hp2\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.622828 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ceph\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.622887 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-inventory\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.623009 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.626577 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ceph\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.628228 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.637444 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-inventory\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.639025 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8hp2\" (UniqueName: \"kubernetes.io/projected/ce33c619-39af-44a4-9643-a60c928971b5-kube-api-access-z8hp2\") pod \"reboot-os-openstack-openstack-cell1-fzkl6\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:14 crc kubenswrapper[4766]: I1209 05:21:14.802137 4766 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:15 crc kubenswrapper[4766]: I1209 05:21:15.381189 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-fzkl6"] Dec 09 05:21:15 crc kubenswrapper[4766]: I1209 05:21:15.412989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" event={"ID":"ce33c619-39af-44a4-9643-a60c928971b5","Type":"ContainerStarted","Data":"d211c4cf704ca08634192a49dc6f284bd1221bcc2d3520bb1cc372371000ce52"} Dec 09 05:21:16 crc kubenswrapper[4766]: I1209 05:21:16.422596 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" event={"ID":"ce33c619-39af-44a4-9643-a60c928971b5","Type":"ContainerStarted","Data":"263a74dd81197be76231470d224a51059061a56fc54391b1eb70679edf25d88f"} Dec 09 05:21:16 crc kubenswrapper[4766]: I1209 05:21:16.446312 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" podStartSLOduration=2.315187649 podStartE2EDuration="2.446287729s" podCreationTimestamp="2025-12-09 05:21:14 +0000 UTC" firstStartedPulling="2025-12-09 05:21:15.38604331 +0000 UTC m=+7757.095348736" lastFinishedPulling="2025-12-09 05:21:15.51714339 +0000 UTC m=+7757.226448816" observedRunningTime="2025-12-09 05:21:16.436095924 +0000 UTC m=+7758.145401350" watchObservedRunningTime="2025-12-09 05:21:16.446287729 +0000 UTC m=+7758.155593195" Dec 09 05:21:20 crc kubenswrapper[4766]: I1209 05:21:20.839654 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:21:20 crc kubenswrapper[4766]: E1209 05:21:20.840549 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:21:32 crc kubenswrapper[4766]: I1209 05:21:32.590370 4766 generic.go:334] "Generic (PLEG): container finished" podID="ce33c619-39af-44a4-9643-a60c928971b5" containerID="263a74dd81197be76231470d224a51059061a56fc54391b1eb70679edf25d88f" exitCode=0 Dec 09 05:21:32 crc kubenswrapper[4766]: I1209 05:21:32.590443 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" event={"ID":"ce33c619-39af-44a4-9643-a60c928971b5","Type":"ContainerDied","Data":"263a74dd81197be76231470d224a51059061a56fc54391b1eb70679edf25d88f"} Dec 09 05:21:33 crc kubenswrapper[4766]: I1209 05:21:33.839449 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:21:33 crc kubenswrapper[4766]: E1209 05:21:33.840350 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.117384 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.276846 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-inventory\") pod \"ce33c619-39af-44a4-9643-a60c928971b5\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.276987 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ssh-key\") pod \"ce33c619-39af-44a4-9643-a60c928971b5\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.277243 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8hp2\" (UniqueName: \"kubernetes.io/projected/ce33c619-39af-44a4-9643-a60c928971b5-kube-api-access-z8hp2\") pod \"ce33c619-39af-44a4-9643-a60c928971b5\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.277381 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ceph\") pod \"ce33c619-39af-44a4-9643-a60c928971b5\" (UID: \"ce33c619-39af-44a4-9643-a60c928971b5\") " Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.282320 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce33c619-39af-44a4-9643-a60c928971b5-kube-api-access-z8hp2" (OuterVolumeSpecName: "kube-api-access-z8hp2") pod "ce33c619-39af-44a4-9643-a60c928971b5" (UID: "ce33c619-39af-44a4-9643-a60c928971b5"). InnerVolumeSpecName "kube-api-access-z8hp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.293652 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ceph" (OuterVolumeSpecName: "ceph") pod "ce33c619-39af-44a4-9643-a60c928971b5" (UID: "ce33c619-39af-44a4-9643-a60c928971b5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.313005 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ce33c619-39af-44a4-9643-a60c928971b5" (UID: "ce33c619-39af-44a4-9643-a60c928971b5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.321390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-inventory" (OuterVolumeSpecName: "inventory") pod "ce33c619-39af-44a4-9643-a60c928971b5" (UID: "ce33c619-39af-44a4-9643-a60c928971b5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.380002 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8hp2\" (UniqueName: \"kubernetes.io/projected/ce33c619-39af-44a4-9643-a60c928971b5-kube-api-access-z8hp2\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.380283 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.380435 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.380564 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ce33c619-39af-44a4-9643-a60c928971b5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.617523 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" event={"ID":"ce33c619-39af-44a4-9643-a60c928971b5","Type":"ContainerDied","Data":"d211c4cf704ca08634192a49dc6f284bd1221bcc2d3520bb1cc372371000ce52"} Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.617581 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d211c4cf704ca08634192a49dc6f284bd1221bcc2d3520bb1cc372371000ce52" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.617622 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-fzkl6" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.751999 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-ntld8"] Dec 09 05:21:34 crc kubenswrapper[4766]: E1209 05:21:34.752540 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce33c619-39af-44a4-9643-a60c928971b5" containerName="reboot-os-openstack-openstack-cell1" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.752559 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce33c619-39af-44a4-9643-a60c928971b5" containerName="reboot-os-openstack-openstack-cell1" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.752892 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce33c619-39af-44a4-9643-a60c928971b5" containerName="reboot-os-openstack-openstack-cell1" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.753886 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.756840 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.757035 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.757138 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.757339 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.764244 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-ntld8"] Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.792410 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.792488 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.792599 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.792665 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.792690 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.792776 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ceph\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.792931 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhrv\" (UniqueName: 
\"kubernetes.io/projected/85b25963-4e2f-4fa9-828a-209e27d8c39b-kube-api-access-mmhrv\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.793011 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.793059 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ssh-key\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.793096 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.793181 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") 
" pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.793249 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-inventory\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.895744 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhrv\" (UniqueName: \"kubernetes.io/projected/85b25963-4e2f-4fa9-828a-209e27d8c39b-kube-api-access-mmhrv\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.895820 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.895859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ssh-key\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.895893 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.895955 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.895988 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-inventory\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.896051 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.896083 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" 
Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.896136 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.896176 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.896197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.896269 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ceph\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.899999 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-inventory\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.900069 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ceph\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.900679 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.901029 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.902142 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.903390 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.903818 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.903875 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.904033 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.911931 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.913661 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ssh-key\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:34 crc kubenswrapper[4766]: I1209 05:21:34.914871 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhrv\" (UniqueName: \"kubernetes.io/projected/85b25963-4e2f-4fa9-828a-209e27d8c39b-kube-api-access-mmhrv\") pod \"install-certs-openstack-openstack-cell1-ntld8\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:35 crc kubenswrapper[4766]: I1209 05:21:35.090355 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:35 crc kubenswrapper[4766]: I1209 05:21:35.659829 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-ntld8"] Dec 09 05:21:36 crc kubenswrapper[4766]: I1209 05:21:36.643287 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-ntld8" event={"ID":"85b25963-4e2f-4fa9-828a-209e27d8c39b","Type":"ContainerStarted","Data":"78bed09ceb9b2f8c961287c095afe8bcff4df729890931406eb5ceb88de24329"} Dec 09 05:21:36 crc kubenswrapper[4766]: I1209 05:21:36.643732 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-ntld8" event={"ID":"85b25963-4e2f-4fa9-828a-209e27d8c39b","Type":"ContainerStarted","Data":"1e65c6b86d1962397bb3c3c83cd09a03d98fe32c159cb7fd6da3eb3e61dd50ea"} Dec 09 05:21:46 crc kubenswrapper[4766]: I1209 05:21:46.839393 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:21:46 crc kubenswrapper[4766]: E1209 05:21:46.840551 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:21:55 crc kubenswrapper[4766]: I1209 05:21:55.832602 4766 generic.go:334] "Generic (PLEG): container finished" podID="85b25963-4e2f-4fa9-828a-209e27d8c39b" containerID="78bed09ceb9b2f8c961287c095afe8bcff4df729890931406eb5ceb88de24329" exitCode=0 Dec 09 05:21:55 crc kubenswrapper[4766]: I1209 05:21:55.832681 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-ntld8" event={"ID":"85b25963-4e2f-4fa9-828a-209e27d8c39b","Type":"ContainerDied","Data":"78bed09ceb9b2f8c961287c095afe8bcff4df729890931406eb5ceb88de24329"} Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.316992 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358116 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-bootstrap-combined-ca-bundle\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358203 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-metadata-combined-ca-bundle\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358316 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-libvirt-combined-ca-bundle\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358471 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-sriov-combined-ca-bundle\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 
05:21:57.358497 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-nova-combined-ca-bundle\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358627 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-telemetry-combined-ca-bundle\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358701 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ssh-key\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358729 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmhrv\" (UniqueName: \"kubernetes.io/projected/85b25963-4e2f-4fa9-828a-209e27d8c39b-kube-api-access-mmhrv\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358767 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-inventory\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358798 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ceph\") pod 
\"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358858 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-dhcp-combined-ca-bundle\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.358895 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ovn-combined-ca-bundle\") pod \"85b25963-4e2f-4fa9-828a-209e27d8c39b\" (UID: \"85b25963-4e2f-4fa9-828a-209e27d8c39b\") " Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.364101 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.364781 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.364804 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.366302 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ceph" (OuterVolumeSpecName: "ceph") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.366371 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.366512 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.367691 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.368114 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.368691 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.368807 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b25963-4e2f-4fa9-828a-209e27d8c39b-kube-api-access-mmhrv" (OuterVolumeSpecName: "kube-api-access-mmhrv") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "kube-api-access-mmhrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.391348 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-inventory" (OuterVolumeSpecName: "inventory") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.393607 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "85b25963-4e2f-4fa9-828a-209e27d8c39b" (UID: "85b25963-4e2f-4fa9-828a-209e27d8c39b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462075 4766 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462455 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462468 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmhrv\" (UniqueName: \"kubernetes.io/projected/85b25963-4e2f-4fa9-828a-209e27d8c39b-kube-api-access-mmhrv\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462484 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc 
kubenswrapper[4766]: I1209 05:21:57.462496 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462507 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462523 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462534 4766 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462548 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462561 4766 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462572 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 
09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.462585 4766 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b25963-4e2f-4fa9-828a-209e27d8c39b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.840573 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:21:57 crc kubenswrapper[4766]: E1209 05:21:57.840821 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.869550 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-ntld8" event={"ID":"85b25963-4e2f-4fa9-828a-209e27d8c39b","Type":"ContainerDied","Data":"1e65c6b86d1962397bb3c3c83cd09a03d98fe32c159cb7fd6da3eb3e61dd50ea"} Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.869592 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e65c6b86d1962397bb3c3c83cd09a03d98fe32c159cb7fd6da3eb3e61dd50ea" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.869639 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-ntld8" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.955369 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vg2pg"] Dec 09 05:21:57 crc kubenswrapper[4766]: E1209 05:21:57.955854 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b25963-4e2f-4fa9-828a-209e27d8c39b" containerName="install-certs-openstack-openstack-cell1" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.955872 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b25963-4e2f-4fa9-828a-209e27d8c39b" containerName="install-certs-openstack-openstack-cell1" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.956101 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b25963-4e2f-4fa9-828a-209e27d8c39b" containerName="install-certs-openstack-openstack-cell1" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.958179 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.960682 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.961005 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.966408 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vg2pg"] Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.973611 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:21:57 crc kubenswrapper[4766]: I1209 05:21:57.973988 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.076782 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-inventory\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.076858 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.076888 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxb2f\" 
(UniqueName: \"kubernetes.io/projected/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-kube-api-access-rxb2f\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.078006 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ceph\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.179976 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-inventory\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.180069 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.180115 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxb2f\" (UniqueName: \"kubernetes.io/projected/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-kube-api-access-rxb2f\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.180242 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ceph\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.184180 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ceph\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.184492 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.193633 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-inventory\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.199274 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxb2f\" (UniqueName: \"kubernetes.io/projected/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-kube-api-access-rxb2f\") pod \"ceph-client-openstack-openstack-cell1-vg2pg\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.293107 
4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.879010 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vg2pg"] Dec 09 05:21:58 crc kubenswrapper[4766]: I1209 05:21:58.894441 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" event={"ID":"8e847b86-2a1f-426f-bf4f-c7739ce8b65b","Type":"ContainerStarted","Data":"bd601fcaa1151c5d9a69994c569bc03b8b80159ca5136563cf9413b66fe0ffae"} Dec 09 05:21:59 crc kubenswrapper[4766]: I1209 05:21:59.073836 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:21:59 crc kubenswrapper[4766]: I1209 05:21:59.906841 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" event={"ID":"8e847b86-2a1f-426f-bf4f-c7739ce8b65b","Type":"ContainerStarted","Data":"1903c51a0e231466d7e7a61374e453b5b6642d5983e61429b3a6c2ef2b0dcff8"} Dec 09 05:21:59 crc kubenswrapper[4766]: I1209 05:21:59.928352 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" podStartSLOduration=2.735631777 podStartE2EDuration="2.928335336s" podCreationTimestamp="2025-12-09 05:21:57 +0000 UTC" firstStartedPulling="2025-12-09 05:21:58.878147968 +0000 UTC m=+7800.587453394" lastFinishedPulling="2025-12-09 05:21:59.070851537 +0000 UTC m=+7800.780156953" observedRunningTime="2025-12-09 05:21:59.926805284 +0000 UTC m=+7801.636110720" watchObservedRunningTime="2025-12-09 05:21:59.928335336 +0000 UTC m=+7801.637640762" Dec 09 05:22:04 crc kubenswrapper[4766]: I1209 05:22:04.952130 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e847b86-2a1f-426f-bf4f-c7739ce8b65b" 
containerID="1903c51a0e231466d7e7a61374e453b5b6642d5983e61429b3a6c2ef2b0dcff8" exitCode=0 Dec 09 05:22:04 crc kubenswrapper[4766]: I1209 05:22:04.952263 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" event={"ID":"8e847b86-2a1f-426f-bf4f-c7739ce8b65b","Type":"ContainerDied","Data":"1903c51a0e231466d7e7a61374e453b5b6642d5983e61429b3a6c2ef2b0dcff8"} Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.404271 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.555797 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ssh-key\") pod \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.556064 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-inventory\") pod \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.556128 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxb2f\" (UniqueName: \"kubernetes.io/projected/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-kube-api-access-rxb2f\") pod \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.556197 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ceph\") pod \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\" (UID: \"8e847b86-2a1f-426f-bf4f-c7739ce8b65b\") " 
Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.561299 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ceph" (OuterVolumeSpecName: "ceph") pod "8e847b86-2a1f-426f-bf4f-c7739ce8b65b" (UID: "8e847b86-2a1f-426f-bf4f-c7739ce8b65b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.566474 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-kube-api-access-rxb2f" (OuterVolumeSpecName: "kube-api-access-rxb2f") pod "8e847b86-2a1f-426f-bf4f-c7739ce8b65b" (UID: "8e847b86-2a1f-426f-bf4f-c7739ce8b65b"). InnerVolumeSpecName "kube-api-access-rxb2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.591047 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-inventory" (OuterVolumeSpecName: "inventory") pod "8e847b86-2a1f-426f-bf4f-c7739ce8b65b" (UID: "8e847b86-2a1f-426f-bf4f-c7739ce8b65b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.603393 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8e847b86-2a1f-426f-bf4f-c7739ce8b65b" (UID: "8e847b86-2a1f-426f-bf4f-c7739ce8b65b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.658709 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.658756 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxb2f\" (UniqueName: \"kubernetes.io/projected/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-kube-api-access-rxb2f\") on node \"crc\" DevicePath \"\"" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.658775 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.658790 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e847b86-2a1f-426f-bf4f-c7739ce8b65b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.969950 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" event={"ID":"8e847b86-2a1f-426f-bf4f-c7739ce8b65b","Type":"ContainerDied","Data":"bd601fcaa1151c5d9a69994c569bc03b8b80159ca5136563cf9413b66fe0ffae"} Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.969987 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd601fcaa1151c5d9a69994c569bc03b8b80159ca5136563cf9413b66fe0ffae" Dec 09 05:22:06 crc kubenswrapper[4766]: I1209 05:22:06.970009 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vg2pg" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.043315 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-vk8d7"] Dec 09 05:22:07 crc kubenswrapper[4766]: E1209 05:22:07.043796 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e847b86-2a1f-426f-bf4f-c7739ce8b65b" containerName="ceph-client-openstack-openstack-cell1" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.043812 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e847b86-2a1f-426f-bf4f-c7739ce8b65b" containerName="ceph-client-openstack-openstack-cell1" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.044056 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e847b86-2a1f-426f-bf4f-c7739ce8b65b" containerName="ceph-client-openstack-openstack-cell1" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.044856 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.047089 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.047297 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.047421 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.047793 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.049336 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.067103 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbc4g\" (UniqueName: \"kubernetes.io/projected/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-kube-api-access-lbc4g\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.067353 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ceph\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.067445 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.067507 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-inventory\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.067691 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.067736 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ssh-key\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.068715 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-vk8d7"] Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.169873 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbc4g\" (UniqueName: \"kubernetes.io/projected/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-kube-api-access-lbc4g\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: 
\"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.169962 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ceph\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.170005 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.170039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-inventory\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.170110 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.170132 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ssh-key\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: 
\"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.172181 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.175558 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ssh-key\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.176988 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.177413 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-inventory\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.179357 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ceph\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " 
pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.187245 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbc4g\" (UniqueName: \"kubernetes.io/projected/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-kube-api-access-lbc4g\") pod \"ovn-openstack-openstack-cell1-vk8d7\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.363344 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.944286 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-vk8d7"] Dec 09 05:22:07 crc kubenswrapper[4766]: I1209 05:22:07.979989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" event={"ID":"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f","Type":"ContainerStarted","Data":"10af09e5b4d18cc3fa62ecfb328f4b53c68a1e975eca2d31990d5d6c43a9ada9"} Dec 09 05:22:08 crc kubenswrapper[4766]: I1209 05:22:08.990610 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" event={"ID":"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f","Type":"ContainerStarted","Data":"8c0dfb15eb8208cd4e0bf125c94c4e6c94e59e3c7112eeb8b297b24864419ad6"} Dec 09 05:22:09 crc kubenswrapper[4766]: I1209 05:22:09.015692 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" podStartSLOduration=1.82744018 podStartE2EDuration="2.015677617s" podCreationTimestamp="2025-12-09 05:22:07 +0000 UTC" firstStartedPulling="2025-12-09 05:22:07.942827516 +0000 UTC m=+7809.652132962" lastFinishedPulling="2025-12-09 05:22:08.131064973 +0000 UTC m=+7809.840370399" observedRunningTime="2025-12-09 05:22:09.015159273 +0000 UTC 
m=+7810.724464709" watchObservedRunningTime="2025-12-09 05:22:09.015677617 +0000 UTC m=+7810.724983043" Dec 09 05:22:11 crc kubenswrapper[4766]: I1209 05:22:11.841184 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:22:11 crc kubenswrapper[4766]: E1209 05:22:11.841835 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:22:22 crc kubenswrapper[4766]: I1209 05:22:22.839479 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:22:22 crc kubenswrapper[4766]: E1209 05:22:22.840589 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:22:33 crc kubenswrapper[4766]: I1209 05:22:33.841091 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:22:33 crc kubenswrapper[4766]: E1209 05:22:33.842406 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:22:47 crc kubenswrapper[4766]: I1209 05:22:47.839967 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:22:47 crc kubenswrapper[4766]: E1209 05:22:47.840951 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:23:00 crc kubenswrapper[4766]: I1209 05:23:00.841172 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:23:00 crc kubenswrapper[4766]: E1209 05:23:00.842917 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:23:11 crc kubenswrapper[4766]: I1209 05:23:11.839881 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:23:11 crc kubenswrapper[4766]: E1209 05:23:11.840562 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:23:17 crc kubenswrapper[4766]: I1209 05:23:17.899900 4766 generic.go:334] "Generic (PLEG): container finished" podID="1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" containerID="8c0dfb15eb8208cd4e0bf125c94c4e6c94e59e3c7112eeb8b297b24864419ad6" exitCode=0 Dec 09 05:23:17 crc kubenswrapper[4766]: I1209 05:23:17.899958 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" event={"ID":"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f","Type":"ContainerDied","Data":"8c0dfb15eb8208cd4e0bf125c94c4e6c94e59e3c7112eeb8b297b24864419ad6"} Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.387123 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.559929 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbc4g\" (UniqueName: \"kubernetes.io/projected/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-kube-api-access-lbc4g\") pod \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.560237 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovncontroller-config-0\") pod \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.560302 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ceph\") 
pod \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.560326 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-inventory\") pod \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.560371 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ssh-key\") pod \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.560391 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovn-combined-ca-bundle\") pod \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\" (UID: \"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f\") " Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.566517 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ceph" (OuterVolumeSpecName: "ceph") pod "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" (UID: "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.566935 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-kube-api-access-lbc4g" (OuterVolumeSpecName: "kube-api-access-lbc4g") pod "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" (UID: "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f"). InnerVolumeSpecName "kube-api-access-lbc4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.567711 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" (UID: "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.596669 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" (UID: "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.599575 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-inventory" (OuterVolumeSpecName: "inventory") pod "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" (UID: "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.611347 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" (UID: "1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.662819 4766 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.662860 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.662876 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.662888 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.662901 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.662914 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbc4g\" (UniqueName: \"kubernetes.io/projected/1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f-kube-api-access-lbc4g\") on node \"crc\" DevicePath \"\"" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.926981 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" event={"ID":"1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f","Type":"ContainerDied","Data":"10af09e5b4d18cc3fa62ecfb328f4b53c68a1e975eca2d31990d5d6c43a9ada9"} Dec 09 05:23:19 crc 
kubenswrapper[4766]: I1209 05:23:19.927052 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10af09e5b4d18cc3fa62ecfb328f4b53c68a1e975eca2d31990d5d6c43a9ada9" Dec 09 05:23:19 crc kubenswrapper[4766]: I1209 05:23:19.927071 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-vk8d7" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.060952 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-gkrmn"] Dec 09 05:23:20 crc kubenswrapper[4766]: E1209 05:23:20.061619 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" containerName="ovn-openstack-openstack-cell1" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.061651 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" containerName="ovn-openstack-openstack-cell1" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.061941 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f" containerName="ovn-openstack-openstack-cell1" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.062882 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.066059 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.066254 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.066474 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.066917 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.067286 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.068812 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.074856 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-gkrmn"] Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.173863 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ct2v\" (UniqueName: \"kubernetes.io/projected/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-kube-api-access-9ct2v\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.174290 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.174374 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.174493 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.174758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.174822 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.174858 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.277031 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.277161 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.277588 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.277697 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.278312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.278489 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ct2v\" (UniqueName: \"kubernetes.io/projected/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-kube-api-access-9ct2v\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.278600 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.282358 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.283624 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.284038 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.284481 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.285297 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.286529 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.301264 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ct2v\" (UniqueName: \"kubernetes.io/projected/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-kube-api-access-9ct2v\") pod \"neutron-metadata-openstack-openstack-cell1-gkrmn\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.398234 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.975243 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-gkrmn"] Dec 09 05:23:20 crc kubenswrapper[4766]: I1209 05:23:20.987595 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:23:21 crc kubenswrapper[4766]: I1209 05:23:21.945082 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" event={"ID":"229ed8f5-5e3b-4f6b-8dbb-860af1016c20","Type":"ContainerStarted","Data":"8e3001d06dc877360597b007d0df240d4f83f3ce3a9646f2a2f3fbed1167c3bd"} Dec 09 05:23:21 crc kubenswrapper[4766]: I1209 05:23:21.945421 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" event={"ID":"229ed8f5-5e3b-4f6b-8dbb-860af1016c20","Type":"ContainerStarted","Data":"0b216f0ece303e8d6437b8922ee0bbb64348b1830494689db5f031cb1b2e078d"} Dec 09 05:23:21 crc kubenswrapper[4766]: I1209 05:23:21.962041 4766 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" podStartSLOduration=1.7973007600000002 podStartE2EDuration="1.962022901s" podCreationTimestamp="2025-12-09 05:23:20 +0000 UTC" firstStartedPulling="2025-12-09 05:23:20.987127982 +0000 UTC m=+7882.696433428" lastFinishedPulling="2025-12-09 05:23:21.151850143 +0000 UTC m=+7882.861155569" observedRunningTime="2025-12-09 05:23:21.957766396 +0000 UTC m=+7883.667071832" watchObservedRunningTime="2025-12-09 05:23:21.962022901 +0000 UTC m=+7883.671328327" Dec 09 05:23:26 crc kubenswrapper[4766]: I1209 05:23:26.839571 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:23:26 crc kubenswrapper[4766]: E1209 05:23:26.840422 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:23:37 crc kubenswrapper[4766]: I1209 05:23:37.840905 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:23:37 crc kubenswrapper[4766]: E1209 05:23:37.842343 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:23:48 crc kubenswrapper[4766]: I1209 05:23:48.840386 4766 
scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:23:48 crc kubenswrapper[4766]: E1209 05:23:48.841189 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:24:03 crc kubenswrapper[4766]: I1209 05:24:03.838660 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:24:03 crc kubenswrapper[4766]: E1209 05:24:03.839528 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:24:17 crc kubenswrapper[4766]: I1209 05:24:17.586531 4766 generic.go:334] "Generic (PLEG): container finished" podID="229ed8f5-5e3b-4f6b-8dbb-860af1016c20" containerID="8e3001d06dc877360597b007d0df240d4f83f3ce3a9646f2a2f3fbed1167c3bd" exitCode=0 Dec 09 05:24:17 crc kubenswrapper[4766]: I1209 05:24:17.586790 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" event={"ID":"229ed8f5-5e3b-4f6b-8dbb-860af1016c20","Type":"ContainerDied","Data":"8e3001d06dc877360597b007d0df240d4f83f3ce3a9646f2a2f3fbed1167c3bd"} Dec 09 05:24:17 crc kubenswrapper[4766]: I1209 05:24:17.839395 4766 scope.go:117] "RemoveContainer" 
containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:24:17 crc kubenswrapper[4766]: E1209 05:24:17.839803 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.042041 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.102404 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ct2v\" (UniqueName: \"kubernetes.io/projected/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-kube-api-access-9ct2v\") pod \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.102494 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-nova-metadata-neutron-config-0\") pod \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.102595 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ceph\") pod \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.102772 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-inventory\") pod \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.102836 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ssh-key\") pod \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.102991 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-metadata-combined-ca-bundle\") pod \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.103031 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-ovn-metadata-agent-neutron-config-0\") pod \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\" (UID: \"229ed8f5-5e3b-4f6b-8dbb-860af1016c20\") " Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.108232 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ceph" (OuterVolumeSpecName: "ceph") pod "229ed8f5-5e3b-4f6b-8dbb-860af1016c20" (UID: "229ed8f5-5e3b-4f6b-8dbb-860af1016c20"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.108543 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "229ed8f5-5e3b-4f6b-8dbb-860af1016c20" (UID: "229ed8f5-5e3b-4f6b-8dbb-860af1016c20"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.108631 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-kube-api-access-9ct2v" (OuterVolumeSpecName: "kube-api-access-9ct2v") pod "229ed8f5-5e3b-4f6b-8dbb-860af1016c20" (UID: "229ed8f5-5e3b-4f6b-8dbb-860af1016c20"). InnerVolumeSpecName "kube-api-access-9ct2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.142474 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "229ed8f5-5e3b-4f6b-8dbb-860af1016c20" (UID: "229ed8f5-5e3b-4f6b-8dbb-860af1016c20"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.144185 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "229ed8f5-5e3b-4f6b-8dbb-860af1016c20" (UID: "229ed8f5-5e3b-4f6b-8dbb-860af1016c20"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.147861 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-inventory" (OuterVolumeSpecName: "inventory") pod "229ed8f5-5e3b-4f6b-8dbb-860af1016c20" (UID: "229ed8f5-5e3b-4f6b-8dbb-860af1016c20"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.151426 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "229ed8f5-5e3b-4f6b-8dbb-860af1016c20" (UID: "229ed8f5-5e3b-4f6b-8dbb-860af1016c20"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.205583 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ct2v\" (UniqueName: \"kubernetes.io/projected/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-kube-api-access-9ct2v\") on node \"crc\" DevicePath \"\"" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.205627 4766 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.205642 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.205657 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.205670 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.205681 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.205693 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/229ed8f5-5e3b-4f6b-8dbb-860af1016c20-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.606996 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" event={"ID":"229ed8f5-5e3b-4f6b-8dbb-860af1016c20","Type":"ContainerDied","Data":"0b216f0ece303e8d6437b8922ee0bbb64348b1830494689db5f031cb1b2e078d"} Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.607273 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b216f0ece303e8d6437b8922ee0bbb64348b1830494689db5f031cb1b2e078d" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.607075 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-gkrmn" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.833883 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-r2jnr"] Dec 09 05:24:19 crc kubenswrapper[4766]: E1209 05:24:19.834496 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229ed8f5-5e3b-4f6b-8dbb-860af1016c20" containerName="neutron-metadata-openstack-openstack-cell1" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.834521 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="229ed8f5-5e3b-4f6b-8dbb-860af1016c20" containerName="neutron-metadata-openstack-openstack-cell1" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.834784 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="229ed8f5-5e3b-4f6b-8dbb-860af1016c20" containerName="neutron-metadata-openstack-openstack-cell1" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.835732 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.846730 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.846746 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.846944 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.847026 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.847409 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.848757 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-r2jnr"] Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.919735 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ceph\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.919804 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ssh-key\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.920035 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.920155 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlcf2\" (UniqueName: \"kubernetes.io/projected/0ef78455-720c-4e11-b479-4f665656e20b-kube-api-access-xlcf2\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.920247 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-inventory\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:19 crc kubenswrapper[4766]: I1209 05:24:19.920509 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.022961 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-combined-ca-bundle\") pod 
\"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.023124 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ceph\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.023163 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ssh-key\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.023260 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.023322 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlcf2\" (UniqueName: \"kubernetes.io/projected/0ef78455-720c-4e11-b479-4f665656e20b-kube-api-access-xlcf2\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.023364 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-inventory\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.029173 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-inventory\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.029643 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.030096 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ceph\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.031249 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.038728 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ssh-key\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.040035 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlcf2\" (UniqueName: \"kubernetes.io/projected/0ef78455-720c-4e11-b479-4f665656e20b-kube-api-access-xlcf2\") pod \"libvirt-openstack-openstack-cell1-r2jnr\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.164641 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:24:20 crc kubenswrapper[4766]: I1209 05:24:20.890025 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-r2jnr"] Dec 09 05:24:21 crc kubenswrapper[4766]: I1209 05:24:21.633888 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" event={"ID":"0ef78455-720c-4e11-b479-4f665656e20b","Type":"ContainerStarted","Data":"5e8bfd621b4098b6b13429e612611832e0dab855d5990fd3616b840e3fd721ea"} Dec 09 05:24:21 crc kubenswrapper[4766]: I1209 05:24:21.634246 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" event={"ID":"0ef78455-720c-4e11-b479-4f665656e20b","Type":"ContainerStarted","Data":"997352010bda507d8b3c937dad0a9e83e110abd47c29b106388ecbc08fa52c3a"} Dec 09 05:24:21 crc kubenswrapper[4766]: I1209 05:24:21.653854 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" podStartSLOduration=2.507352847 podStartE2EDuration="2.653838964s" podCreationTimestamp="2025-12-09 05:24:19 +0000 UTC" 
firstStartedPulling="2025-12-09 05:24:20.900502074 +0000 UTC m=+7942.609807500" lastFinishedPulling="2025-12-09 05:24:21.046988191 +0000 UTC m=+7942.756293617" observedRunningTime="2025-12-09 05:24:21.652721744 +0000 UTC m=+7943.362027160" watchObservedRunningTime="2025-12-09 05:24:21.653838964 +0000 UTC m=+7943.363144390" Dec 09 05:24:29 crc kubenswrapper[4766]: I1209 05:24:29.839443 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:24:29 crc kubenswrapper[4766]: E1209 05:24:29.840298 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:24:41 crc kubenswrapper[4766]: I1209 05:24:41.840391 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:24:41 crc kubenswrapper[4766]: E1209 05:24:41.841841 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:24:53 crc kubenswrapper[4766]: I1209 05:24:53.840619 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:24:53 crc kubenswrapper[4766]: E1209 05:24:53.841648 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:25:08 crc kubenswrapper[4766]: I1209 05:25:08.847451 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:25:08 crc kubenswrapper[4766]: E1209 05:25:08.849510 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:25:22 crc kubenswrapper[4766]: I1209 05:25:22.839818 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:25:22 crc kubenswrapper[4766]: E1209 05:25:22.840762 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:25:33 crc kubenswrapper[4766]: I1209 05:25:33.839562 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:25:33 crc kubenswrapper[4766]: E1209 05:25:33.840463 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:25:44 crc kubenswrapper[4766]: I1209 05:25:44.839467 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:25:44 crc kubenswrapper[4766]: E1209 05:25:44.840194 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.272867 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q9h2x"] Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.275842 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.289178 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9h2x"] Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.449882 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-utilities\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.450365 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qjcx\" (UniqueName: \"kubernetes.io/projected/a75f80f2-35a6-4c59-966a-db46c9b41131-kube-api-access-8qjcx\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.450550 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-catalog-content\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.552011 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qjcx\" (UniqueName: \"kubernetes.io/projected/a75f80f2-35a6-4c59-966a-db46c9b41131-kube-api-access-8qjcx\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.552081 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-catalog-content\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.552121 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-utilities\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.552651 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-utilities\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.552746 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-catalog-content\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.580409 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qjcx\" (UniqueName: \"kubernetes.io/projected/a75f80f2-35a6-4c59-966a-db46c9b41131-kube-api-access-8qjcx\") pod \"redhat-marketplace-q9h2x\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:45 crc kubenswrapper[4766]: I1209 05:25:45.643608 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:46 crc kubenswrapper[4766]: I1209 05:25:46.130797 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9h2x"] Dec 09 05:25:46 crc kubenswrapper[4766]: I1209 05:25:46.575394 4766 generic.go:334] "Generic (PLEG): container finished" podID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerID="56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762" exitCode=0 Dec 09 05:25:46 crc kubenswrapper[4766]: I1209 05:25:46.575474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9h2x" event={"ID":"a75f80f2-35a6-4c59-966a-db46c9b41131","Type":"ContainerDied","Data":"56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762"} Dec 09 05:25:46 crc kubenswrapper[4766]: I1209 05:25:46.575757 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9h2x" event={"ID":"a75f80f2-35a6-4c59-966a-db46c9b41131","Type":"ContainerStarted","Data":"e277b9deeb7f01e8c2124cf79ef454b5efb5a99bfc6aa48d6dd87f6642eb0f75"} Dec 09 05:25:47 crc kubenswrapper[4766]: I1209 05:25:47.588982 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9h2x" event={"ID":"a75f80f2-35a6-4c59-966a-db46c9b41131","Type":"ContainerStarted","Data":"ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96"} Dec 09 05:25:48 crc kubenswrapper[4766]: I1209 05:25:48.615025 4766 generic.go:334] "Generic (PLEG): container finished" podID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerID="ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96" exitCode=0 Dec 09 05:25:48 crc kubenswrapper[4766]: I1209 05:25:48.615073 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9h2x" 
event={"ID":"a75f80f2-35a6-4c59-966a-db46c9b41131","Type":"ContainerDied","Data":"ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96"} Dec 09 05:25:49 crc kubenswrapper[4766]: I1209 05:25:49.631485 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9h2x" event={"ID":"a75f80f2-35a6-4c59-966a-db46c9b41131","Type":"ContainerStarted","Data":"bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f"} Dec 09 05:25:49 crc kubenswrapper[4766]: I1209 05:25:49.654896 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q9h2x" podStartSLOduration=2.209413498 podStartE2EDuration="4.654878407s" podCreationTimestamp="2025-12-09 05:25:45 +0000 UTC" firstStartedPulling="2025-12-09 05:25:46.578973227 +0000 UTC m=+8028.288278653" lastFinishedPulling="2025-12-09 05:25:49.024438106 +0000 UTC m=+8030.733743562" observedRunningTime="2025-12-09 05:25:49.65272133 +0000 UTC m=+8031.362026756" watchObservedRunningTime="2025-12-09 05:25:49.654878407 +0000 UTC m=+8031.364183843" Dec 09 05:25:55 crc kubenswrapper[4766]: I1209 05:25:55.644080 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:55 crc kubenswrapper[4766]: I1209 05:25:55.644665 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:55 crc kubenswrapper[4766]: I1209 05:25:55.725627 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:55 crc kubenswrapper[4766]: I1209 05:25:55.772361 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:55 crc kubenswrapper[4766]: I1209 05:25:55.839037 4766 scope.go:117] "RemoveContainer" 
containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:25:55 crc kubenswrapper[4766]: E1209 05:25:55.839447 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:25:55 crc kubenswrapper[4766]: I1209 05:25:55.985107 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9h2x"] Dec 09 05:25:57 crc kubenswrapper[4766]: I1209 05:25:57.726353 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q9h2x" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerName="registry-server" containerID="cri-o://bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f" gracePeriod=2 Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.288727 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.438625 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-utilities\") pod \"a75f80f2-35a6-4c59-966a-db46c9b41131\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.438719 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-catalog-content\") pod \"a75f80f2-35a6-4c59-966a-db46c9b41131\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.438973 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qjcx\" (UniqueName: \"kubernetes.io/projected/a75f80f2-35a6-4c59-966a-db46c9b41131-kube-api-access-8qjcx\") pod \"a75f80f2-35a6-4c59-966a-db46c9b41131\" (UID: \"a75f80f2-35a6-4c59-966a-db46c9b41131\") " Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.446057 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75f80f2-35a6-4c59-966a-db46c9b41131-kube-api-access-8qjcx" (OuterVolumeSpecName: "kube-api-access-8qjcx") pod "a75f80f2-35a6-4c59-966a-db46c9b41131" (UID: "a75f80f2-35a6-4c59-966a-db46c9b41131"). InnerVolumeSpecName "kube-api-access-8qjcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.454143 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-utilities" (OuterVolumeSpecName: "utilities") pod "a75f80f2-35a6-4c59-966a-db46c9b41131" (UID: "a75f80f2-35a6-4c59-966a-db46c9b41131"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.472661 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a75f80f2-35a6-4c59-966a-db46c9b41131" (UID: "a75f80f2-35a6-4c59-966a-db46c9b41131"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.541701 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qjcx\" (UniqueName: \"kubernetes.io/projected/a75f80f2-35a6-4c59-966a-db46c9b41131-kube-api-access-8qjcx\") on node \"crc\" DevicePath \"\"" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.542023 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.542123 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75f80f2-35a6-4c59-966a-db46c9b41131-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.736701 4766 generic.go:334] "Generic (PLEG): container finished" podID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerID="bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f" exitCode=0 Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.736749 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9h2x" event={"ID":"a75f80f2-35a6-4c59-966a-db46c9b41131","Type":"ContainerDied","Data":"bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f"} Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.736757 4766 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9h2x" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.736793 4766 scope.go:117] "RemoveContainer" containerID="bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.736778 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9h2x" event={"ID":"a75f80f2-35a6-4c59-966a-db46c9b41131","Type":"ContainerDied","Data":"e277b9deeb7f01e8c2124cf79ef454b5efb5a99bfc6aa48d6dd87f6642eb0f75"} Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.782332 4766 scope.go:117] "RemoveContainer" containerID="ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.794509 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9h2x"] Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.813377 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9h2x"] Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.818324 4766 scope.go:117] "RemoveContainer" containerID="56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.861742 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" path="/var/lib/kubelet/pods/a75f80f2-35a6-4c59-966a-db46c9b41131/volumes" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.897728 4766 scope.go:117] "RemoveContainer" containerID="bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f" Dec 09 05:25:58 crc kubenswrapper[4766]: E1209 05:25:58.899577 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f\": container with ID 
starting with bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f not found: ID does not exist" containerID="bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.899643 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f"} err="failed to get container status \"bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f\": rpc error: code = NotFound desc = could not find container \"bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f\": container with ID starting with bee4118532c0bb57ccd4604dd395a761273ec10d9af248255ad23a7c8bfb1e0f not found: ID does not exist" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.899695 4766 scope.go:117] "RemoveContainer" containerID="ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96" Dec 09 05:25:58 crc kubenswrapper[4766]: E1209 05:25:58.900120 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96\": container with ID starting with ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96 not found: ID does not exist" containerID="ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.900147 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96"} err="failed to get container status \"ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96\": rpc error: code = NotFound desc = could not find container \"ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96\": container with ID starting with ad791e62e8ab49396c6560c7b22afe10200ff96d3e1810d64da7b5812e520b96 not found: 
ID does not exist" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.900163 4766 scope.go:117] "RemoveContainer" containerID="56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762" Dec 09 05:25:58 crc kubenswrapper[4766]: E1209 05:25:58.900603 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762\": container with ID starting with 56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762 not found: ID does not exist" containerID="56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762" Dec 09 05:25:58 crc kubenswrapper[4766]: I1209 05:25:58.900663 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762"} err="failed to get container status \"56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762\": rpc error: code = NotFound desc = could not find container \"56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762\": container with ID starting with 56a5ab3df2d6dc820a124304d4d5d1ccbb36d0fa1f1212596f4d8cc4bab7f762 not found: ID does not exist" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.384816 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vk5gv"] Dec 09 05:26:03 crc kubenswrapper[4766]: E1209 05:26:03.387586 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerName="registry-server" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.387720 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerName="registry-server" Dec 09 05:26:03 crc kubenswrapper[4766]: E1209 05:26:03.387883 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" 
containerName="extract-utilities" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.387993 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerName="extract-utilities" Dec 09 05:26:03 crc kubenswrapper[4766]: E1209 05:26:03.388088 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerName="extract-content" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.388173 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerName="extract-content" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.388617 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75f80f2-35a6-4c59-966a-db46c9b41131" containerName="registry-server" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.390851 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.422817 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vk5gv"] Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.549782 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-utilities\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.549857 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zlq\" (UniqueName: \"kubernetes.io/projected/54cc491c-5fc0-413d-854f-5dccac097c86-kube-api-access-x7zlq\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " 
pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.549919 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-catalog-content\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.652110 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7zlq\" (UniqueName: \"kubernetes.io/projected/54cc491c-5fc0-413d-854f-5dccac097c86-kube-api-access-x7zlq\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.652199 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-catalog-content\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.652368 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-utilities\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.652856 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-catalog-content\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " 
pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.652938 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-utilities\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.684698 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zlq\" (UniqueName: \"kubernetes.io/projected/54cc491c-5fc0-413d-854f-5dccac097c86-kube-api-access-x7zlq\") pod \"redhat-operators-vk5gv\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:03 crc kubenswrapper[4766]: I1209 05:26:03.713334 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:04 crc kubenswrapper[4766]: I1209 05:26:04.221300 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vk5gv"] Dec 09 05:26:04 crc kubenswrapper[4766]: I1209 05:26:04.816080 4766 generic.go:334] "Generic (PLEG): container finished" podID="54cc491c-5fc0-413d-854f-5dccac097c86" containerID="0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576" exitCode=0 Dec 09 05:26:04 crc kubenswrapper[4766]: I1209 05:26:04.816138 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk5gv" event={"ID":"54cc491c-5fc0-413d-854f-5dccac097c86","Type":"ContainerDied","Data":"0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576"} Dec 09 05:26:04 crc kubenswrapper[4766]: I1209 05:26:04.816414 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk5gv" 
event={"ID":"54cc491c-5fc0-413d-854f-5dccac097c86","Type":"ContainerStarted","Data":"db4f3650f23d3cca7f4ff21907d74e76535bb229af7a1eb530fc78b6880e4085"} Dec 09 05:26:06 crc kubenswrapper[4766]: I1209 05:26:06.840960 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:26:06 crc kubenswrapper[4766]: E1209 05:26:06.841687 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:26:06 crc kubenswrapper[4766]: I1209 05:26:06.855952 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk5gv" event={"ID":"54cc491c-5fc0-413d-854f-5dccac097c86","Type":"ContainerStarted","Data":"895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727"} Dec 09 05:26:08 crc kubenswrapper[4766]: I1209 05:26:08.881502 4766 generic.go:334] "Generic (PLEG): container finished" podID="54cc491c-5fc0-413d-854f-5dccac097c86" containerID="895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727" exitCode=0 Dec 09 05:26:08 crc kubenswrapper[4766]: I1209 05:26:08.881609 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk5gv" event={"ID":"54cc491c-5fc0-413d-854f-5dccac097c86","Type":"ContainerDied","Data":"895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727"} Dec 09 05:26:09 crc kubenswrapper[4766]: I1209 05:26:09.892384 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk5gv" 
event={"ID":"54cc491c-5fc0-413d-854f-5dccac097c86","Type":"ContainerStarted","Data":"dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24"} Dec 09 05:26:09 crc kubenswrapper[4766]: I1209 05:26:09.908912 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vk5gv" podStartSLOduration=2.261088209 podStartE2EDuration="6.908892004s" podCreationTimestamp="2025-12-09 05:26:03 +0000 UTC" firstStartedPulling="2025-12-09 05:26:04.82127904 +0000 UTC m=+8046.530584456" lastFinishedPulling="2025-12-09 05:26:09.469082825 +0000 UTC m=+8051.178388251" observedRunningTime="2025-12-09 05:26:09.906282344 +0000 UTC m=+8051.615587780" watchObservedRunningTime="2025-12-09 05:26:09.908892004 +0000 UTC m=+8051.618197430" Dec 09 05:26:13 crc kubenswrapper[4766]: I1209 05:26:13.713461 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:13 crc kubenswrapper[4766]: I1209 05:26:13.714030 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:14 crc kubenswrapper[4766]: I1209 05:26:14.784644 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vk5gv" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="registry-server" probeResult="failure" output=< Dec 09 05:26:14 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:26:14 crc kubenswrapper[4766]: > Dec 09 05:26:17 crc kubenswrapper[4766]: I1209 05:26:17.839303 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:26:18 crc kubenswrapper[4766]: I1209 05:26:18.984252 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"0115129135ca679d8e3aa30f2321dde0fc2dc2a485d2fb80532fe4fd9d7c79b2"} Dec 09 05:26:23 crc kubenswrapper[4766]: I1209 05:26:23.771116 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:23 crc kubenswrapper[4766]: I1209 05:26:23.826781 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:24 crc kubenswrapper[4766]: I1209 05:26:24.013913 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vk5gv"] Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.047143 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vk5gv" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="registry-server" containerID="cri-o://dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24" gracePeriod=2 Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.563207 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.589623 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-catalog-content\") pod \"54cc491c-5fc0-413d-854f-5dccac097c86\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.589776 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-utilities\") pod \"54cc491c-5fc0-413d-854f-5dccac097c86\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.589859 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zlq\" (UniqueName: \"kubernetes.io/projected/54cc491c-5fc0-413d-854f-5dccac097c86-kube-api-access-x7zlq\") pod \"54cc491c-5fc0-413d-854f-5dccac097c86\" (UID: \"54cc491c-5fc0-413d-854f-5dccac097c86\") " Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.590800 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-utilities" (OuterVolumeSpecName: "utilities") pod "54cc491c-5fc0-413d-854f-5dccac097c86" (UID: "54cc491c-5fc0-413d-854f-5dccac097c86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.596412 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cc491c-5fc0-413d-854f-5dccac097c86-kube-api-access-x7zlq" (OuterVolumeSpecName: "kube-api-access-x7zlq") pod "54cc491c-5fc0-413d-854f-5dccac097c86" (UID: "54cc491c-5fc0-413d-854f-5dccac097c86"). InnerVolumeSpecName "kube-api-access-x7zlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.692437 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.692792 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zlq\" (UniqueName: \"kubernetes.io/projected/54cc491c-5fc0-413d-854f-5dccac097c86-kube-api-access-x7zlq\") on node \"crc\" DevicePath \"\"" Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.738300 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54cc491c-5fc0-413d-854f-5dccac097c86" (UID: "54cc491c-5fc0-413d-854f-5dccac097c86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:26:25 crc kubenswrapper[4766]: I1209 05:26:25.798238 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cc491c-5fc0-413d-854f-5dccac097c86-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.058842 4766 generic.go:334] "Generic (PLEG): container finished" podID="54cc491c-5fc0-413d-854f-5dccac097c86" containerID="dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24" exitCode=0 Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.058921 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vk5gv" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.059957 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk5gv" event={"ID":"54cc491c-5fc0-413d-854f-5dccac097c86","Type":"ContainerDied","Data":"dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24"} Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.060090 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk5gv" event={"ID":"54cc491c-5fc0-413d-854f-5dccac097c86","Type":"ContainerDied","Data":"db4f3650f23d3cca7f4ff21907d74e76535bb229af7a1eb530fc78b6880e4085"} Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.060153 4766 scope.go:117] "RemoveContainer" containerID="dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.102177 4766 scope.go:117] "RemoveContainer" containerID="895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.117361 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vk5gv"] Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.126192 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vk5gv"] Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.139084 4766 scope.go:117] "RemoveContainer" containerID="0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.189606 4766 scope.go:117] "RemoveContainer" containerID="dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24" Dec 09 05:26:26 crc kubenswrapper[4766]: E1209 05:26:26.189921 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24\": container with ID starting with dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24 not found: ID does not exist" containerID="dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.190032 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24"} err="failed to get container status \"dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24\": rpc error: code = NotFound desc = could not find container \"dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24\": container with ID starting with dd9dcdb9de942587d84c7291d2d719f885c21308ca25a0b7d9e85d767df3ee24 not found: ID does not exist" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.190118 4766 scope.go:117] "RemoveContainer" containerID="895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727" Dec 09 05:26:26 crc kubenswrapper[4766]: E1209 05:26:26.191011 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727\": container with ID starting with 895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727 not found: ID does not exist" containerID="895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.191086 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727"} err="failed to get container status \"895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727\": rpc error: code = NotFound desc = could not find container \"895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727\": container with ID 
starting with 895355d5272a05ad21243c7c838abfc268ed1631f3cfdd9f5221c8ed5f15b727 not found: ID does not exist" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.191159 4766 scope.go:117] "RemoveContainer" containerID="0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576" Dec 09 05:26:26 crc kubenswrapper[4766]: E1209 05:26:26.191781 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576\": container with ID starting with 0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576 not found: ID does not exist" containerID="0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.191817 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576"} err="failed to get container status \"0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576\": rpc error: code = NotFound desc = could not find container \"0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576\": container with ID starting with 0bfa9585d32d3921f20f3e3f42ab9f3bd9f63dd9907385be2c84a470ac3e5576 not found: ID does not exist" Dec 09 05:26:26 crc kubenswrapper[4766]: I1209 05:26:26.867594 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" path="/var/lib/kubelet/pods/54cc491c-5fc0-413d-854f-5dccac097c86/volumes" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.637136 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gqvjj"] Dec 09 05:26:32 crc kubenswrapper[4766]: E1209 05:26:32.639719 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="extract-content" Dec 09 05:26:32 crc 
kubenswrapper[4766]: I1209 05:26:32.639824 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="extract-content" Dec 09 05:26:32 crc kubenswrapper[4766]: E1209 05:26:32.639900 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="extract-utilities" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.639967 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="extract-utilities" Dec 09 05:26:32 crc kubenswrapper[4766]: E1209 05:26:32.640047 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="registry-server" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.640109 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="registry-server" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.640629 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="54cc491c-5fc0-413d-854f-5dccac097c86" containerName="registry-server" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.642348 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.654944 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqvjj"] Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.785277 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-catalog-content\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.785624 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8wth\" (UniqueName: \"kubernetes.io/projected/b84b963a-e1b2-448c-a456-fb151f419417-kube-api-access-f8wth\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.785946 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-utilities\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.888345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-catalog-content\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.888480 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f8wth\" (UniqueName: \"kubernetes.io/projected/b84b963a-e1b2-448c-a456-fb151f419417-kube-api-access-f8wth\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.888519 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-utilities\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.889047 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-utilities\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.889085 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-catalog-content\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.910583 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8wth\" (UniqueName: \"kubernetes.io/projected/b84b963a-e1b2-448c-a456-fb151f419417-kube-api-access-f8wth\") pod \"certified-operators-gqvjj\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:32 crc kubenswrapper[4766]: I1209 05:26:32.972182 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:33 crc kubenswrapper[4766]: I1209 05:26:33.497236 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqvjj"] Dec 09 05:26:34 crc kubenswrapper[4766]: I1209 05:26:34.173069 4766 generic.go:334] "Generic (PLEG): container finished" podID="b84b963a-e1b2-448c-a456-fb151f419417" containerID="5ec3de8f6bb29d4803ae7afcae8d67ebaba134ed49153e0f5a8d99e5ce88001a" exitCode=0 Dec 09 05:26:34 crc kubenswrapper[4766]: I1209 05:26:34.173359 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqvjj" event={"ID":"b84b963a-e1b2-448c-a456-fb151f419417","Type":"ContainerDied","Data":"5ec3de8f6bb29d4803ae7afcae8d67ebaba134ed49153e0f5a8d99e5ce88001a"} Dec 09 05:26:34 crc kubenswrapper[4766]: I1209 05:26:34.173385 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqvjj" event={"ID":"b84b963a-e1b2-448c-a456-fb151f419417","Type":"ContainerStarted","Data":"b3d3cc6e011542f84d5f30d242bf1449c0c290d1d7fe8b21bd07df05c1a3d7fb"} Dec 09 05:26:35 crc kubenswrapper[4766]: I1209 05:26:35.183375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqvjj" event={"ID":"b84b963a-e1b2-448c-a456-fb151f419417","Type":"ContainerStarted","Data":"1d28d3c46ea986d56a08fea91db2fff81e90d088c8b1ffd2806aa45af892c3e1"} Dec 09 05:26:36 crc kubenswrapper[4766]: I1209 05:26:36.204701 4766 generic.go:334] "Generic (PLEG): container finished" podID="b84b963a-e1b2-448c-a456-fb151f419417" containerID="1d28d3c46ea986d56a08fea91db2fff81e90d088c8b1ffd2806aa45af892c3e1" exitCode=0 Dec 09 05:26:36 crc kubenswrapper[4766]: I1209 05:26:36.204754 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqvjj" 
event={"ID":"b84b963a-e1b2-448c-a456-fb151f419417","Type":"ContainerDied","Data":"1d28d3c46ea986d56a08fea91db2fff81e90d088c8b1ffd2806aa45af892c3e1"} Dec 09 05:26:37 crc kubenswrapper[4766]: I1209 05:26:37.216998 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqvjj" event={"ID":"b84b963a-e1b2-448c-a456-fb151f419417","Type":"ContainerStarted","Data":"3a79c6fc3c81d8d2fcde21e0998ae00fe89f9559b6cba04e7921961d7585740d"} Dec 09 05:26:42 crc kubenswrapper[4766]: I1209 05:26:42.973341 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:42 crc kubenswrapper[4766]: I1209 05:26:42.973980 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:43 crc kubenswrapper[4766]: I1209 05:26:43.037845 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:43 crc kubenswrapper[4766]: I1209 05:26:43.058611 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gqvjj" podStartSLOduration=8.558661013 podStartE2EDuration="11.058593107s" podCreationTimestamp="2025-12-09 05:26:32 +0000 UTC" firstStartedPulling="2025-12-09 05:26:34.175501587 +0000 UTC m=+8075.884807023" lastFinishedPulling="2025-12-09 05:26:36.675433691 +0000 UTC m=+8078.384739117" observedRunningTime="2025-12-09 05:26:37.246648438 +0000 UTC m=+8078.955953864" watchObservedRunningTime="2025-12-09 05:26:43.058593107 +0000 UTC m=+8084.767898533" Dec 09 05:26:43 crc kubenswrapper[4766]: I1209 05:26:43.340740 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:43 crc kubenswrapper[4766]: I1209 05:26:43.389750 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-gqvjj"] Dec 09 05:26:45 crc kubenswrapper[4766]: I1209 05:26:45.300930 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gqvjj" podUID="b84b963a-e1b2-448c-a456-fb151f419417" containerName="registry-server" containerID="cri-o://3a79c6fc3c81d8d2fcde21e0998ae00fe89f9559b6cba04e7921961d7585740d" gracePeriod=2 Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.314671 4766 generic.go:334] "Generic (PLEG): container finished" podID="b84b963a-e1b2-448c-a456-fb151f419417" containerID="3a79c6fc3c81d8d2fcde21e0998ae00fe89f9559b6cba04e7921961d7585740d" exitCode=0 Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.314737 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqvjj" event={"ID":"b84b963a-e1b2-448c-a456-fb151f419417","Type":"ContainerDied","Data":"3a79c6fc3c81d8d2fcde21e0998ae00fe89f9559b6cba04e7921961d7585740d"} Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.314948 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqvjj" event={"ID":"b84b963a-e1b2-448c-a456-fb151f419417","Type":"ContainerDied","Data":"b3d3cc6e011542f84d5f30d242bf1449c0c290d1d7fe8b21bd07df05c1a3d7fb"} Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.314959 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3d3cc6e011542f84d5f30d242bf1449c0c290d1d7fe8b21bd07df05c1a3d7fb" Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.377135 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.562112 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8wth\" (UniqueName: \"kubernetes.io/projected/b84b963a-e1b2-448c-a456-fb151f419417-kube-api-access-f8wth\") pod \"b84b963a-e1b2-448c-a456-fb151f419417\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.562463 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-utilities\") pod \"b84b963a-e1b2-448c-a456-fb151f419417\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.562612 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-catalog-content\") pod \"b84b963a-e1b2-448c-a456-fb151f419417\" (UID: \"b84b963a-e1b2-448c-a456-fb151f419417\") " Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.563411 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-utilities" (OuterVolumeSpecName: "utilities") pod "b84b963a-e1b2-448c-a456-fb151f419417" (UID: "b84b963a-e1b2-448c-a456-fb151f419417"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.572594 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84b963a-e1b2-448c-a456-fb151f419417-kube-api-access-f8wth" (OuterVolumeSpecName: "kube-api-access-f8wth") pod "b84b963a-e1b2-448c-a456-fb151f419417" (UID: "b84b963a-e1b2-448c-a456-fb151f419417"). InnerVolumeSpecName "kube-api-access-f8wth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.616895 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b84b963a-e1b2-448c-a456-fb151f419417" (UID: "b84b963a-e1b2-448c-a456-fb151f419417"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.665378 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.665606 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84b963a-e1b2-448c-a456-fb151f419417-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:26:46 crc kubenswrapper[4766]: I1209 05:26:46.665690 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8wth\" (UniqueName: \"kubernetes.io/projected/b84b963a-e1b2-448c-a456-fb151f419417-kube-api-access-f8wth\") on node \"crc\" DevicePath \"\"" Dec 09 05:26:47 crc kubenswrapper[4766]: I1209 05:26:47.322558 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gqvjj" Dec 09 05:26:47 crc kubenswrapper[4766]: I1209 05:26:47.344089 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqvjj"] Dec 09 05:26:47 crc kubenswrapper[4766]: I1209 05:26:47.352840 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gqvjj"] Dec 09 05:26:48 crc kubenswrapper[4766]: I1209 05:26:48.863506 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b84b963a-e1b2-448c-a456-fb151f419417" path="/var/lib/kubelet/pods/b84b963a-e1b2-448c-a456-fb151f419417/volumes" Dec 09 05:28:37 crc kubenswrapper[4766]: I1209 05:28:37.317036 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:28:37 crc kubenswrapper[4766]: I1209 05:28:37.317837 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:29:07 crc kubenswrapper[4766]: I1209 05:29:07.316289 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:29:07 crc kubenswrapper[4766]: I1209 05:29:07.316790 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:29:20 crc kubenswrapper[4766]: I1209 05:29:20.015554 4766 generic.go:334] "Generic (PLEG): container finished" podID="0ef78455-720c-4e11-b479-4f665656e20b" containerID="5e8bfd621b4098b6b13429e612611832e0dab855d5990fd3616b840e3fd721ea" exitCode=0 Dec 09 05:29:20 crc kubenswrapper[4766]: I1209 05:29:20.015661 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" event={"ID":"0ef78455-720c-4e11-b479-4f665656e20b","Type":"ContainerDied","Data":"5e8bfd621b4098b6b13429e612611832e0dab855d5990fd3616b840e3fd721ea"} Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.492078 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.600993 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-secret-0\") pod \"0ef78455-720c-4e11-b479-4f665656e20b\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.601050 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlcf2\" (UniqueName: \"kubernetes.io/projected/0ef78455-720c-4e11-b479-4f665656e20b-kube-api-access-xlcf2\") pod \"0ef78455-720c-4e11-b479-4f665656e20b\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.601117 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-inventory\") pod \"0ef78455-720c-4e11-b479-4f665656e20b\" 
(UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.601154 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ssh-key\") pod \"0ef78455-720c-4e11-b479-4f665656e20b\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.601190 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-combined-ca-bundle\") pod \"0ef78455-720c-4e11-b479-4f665656e20b\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.601258 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ceph\") pod \"0ef78455-720c-4e11-b479-4f665656e20b\" (UID: \"0ef78455-720c-4e11-b479-4f665656e20b\") " Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.617729 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ef78455-720c-4e11-b479-4f665656e20b-kube-api-access-xlcf2" (OuterVolumeSpecName: "kube-api-access-xlcf2") pod "0ef78455-720c-4e11-b479-4f665656e20b" (UID: "0ef78455-720c-4e11-b479-4f665656e20b"). InnerVolumeSpecName "kube-api-access-xlcf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.652381 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0ef78455-720c-4e11-b479-4f665656e20b" (UID: "0ef78455-720c-4e11-b479-4f665656e20b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.652715 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ceph" (OuterVolumeSpecName: "ceph") pod "0ef78455-720c-4e11-b479-4f665656e20b" (UID: "0ef78455-720c-4e11-b479-4f665656e20b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.713074 4766 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.713104 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.713117 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlcf2\" (UniqueName: \"kubernetes.io/projected/0ef78455-720c-4e11-b479-4f665656e20b-kube-api-access-xlcf2\") on node \"crc\" DevicePath \"\"" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.762648 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0ef78455-720c-4e11-b479-4f665656e20b" (UID: "0ef78455-720c-4e11-b479-4f665656e20b"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.777373 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0ef78455-720c-4e11-b479-4f665656e20b" (UID: "0ef78455-720c-4e11-b479-4f665656e20b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.818485 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-inventory" (OuterVolumeSpecName: "inventory") pod "0ef78455-720c-4e11-b479-4f665656e20b" (UID: "0ef78455-720c-4e11-b479-4f665656e20b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.820829 4766 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.820886 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:21.820896 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ef78455-720c-4e11-b479-4f665656e20b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.041003 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" event={"ID":"0ef78455-720c-4e11-b479-4f665656e20b","Type":"ContainerDied","Data":"997352010bda507d8b3c937dad0a9e83e110abd47c29b106388ecbc08fa52c3a"} Dec 09 
05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.041038 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="997352010bda507d8b3c937dad0a9e83e110abd47c29b106388ecbc08fa52c3a" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.041082 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-r2jnr" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.142686 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-hjl6d"] Dec 09 05:29:22 crc kubenswrapper[4766]: E1209 05:29:22.143201 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84b963a-e1b2-448c-a456-fb151f419417" containerName="extract-utilities" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.143240 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84b963a-e1b2-448c-a456-fb151f419417" containerName="extract-utilities" Dec 09 05:29:22 crc kubenswrapper[4766]: E1209 05:29:22.143256 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef78455-720c-4e11-b479-4f665656e20b" containerName="libvirt-openstack-openstack-cell1" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.143265 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef78455-720c-4e11-b479-4f665656e20b" containerName="libvirt-openstack-openstack-cell1" Dec 09 05:29:22 crc kubenswrapper[4766]: E1209 05:29:22.143324 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84b963a-e1b2-448c-a456-fb151f419417" containerName="registry-server" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.143333 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84b963a-e1b2-448c-a456-fb151f419417" containerName="registry-server" Dec 09 05:29:22 crc kubenswrapper[4766]: E1209 05:29:22.143356 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84b963a-e1b2-448c-a456-fb151f419417" 
containerName="extract-content" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.143364 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84b963a-e1b2-448c-a456-fb151f419417" containerName="extract-content" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.143642 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b84b963a-e1b2-448c-a456-fb151f419417" containerName="registry-server" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.143691 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef78455-720c-4e11-b479-4f665656e20b" containerName="libvirt-openstack-openstack-cell1" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.144661 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.148656 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.148819 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.148882 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.148912 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.150125 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.150389 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.157173 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-openstack-openstack-cell1-hjl6d"] Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.159232 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.331822 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332021 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-inventory\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332091 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332203 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332266 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332295 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332325 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332516 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbg7q\" (UniqueName: \"kubernetes.io/projected/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-kube-api-access-rbg7q\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332706 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332766 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.332840 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ceph\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.435748 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.435953 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-inventory\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.436009 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.436066 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.436098 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.436128 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.436172 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.436305 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbg7q\" (UniqueName: \"kubernetes.io/projected/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-kube-api-access-rbg7q\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.436394 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.437095 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.437767 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ceph\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc 
kubenswrapper[4766]: I1209 05:29:22.437318 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.438509 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.442243 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.443092 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.443685 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: 
\"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.444264 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.444762 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ceph\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.445276 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-inventory\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.445551 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.448786 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.455556 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbg7q\" (UniqueName: \"kubernetes.io/projected/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-kube-api-access-rbg7q\") pod \"nova-cell1-openstack-openstack-cell1-hjl6d\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:22 crc kubenswrapper[4766]: I1209 05:29:22.479509 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:29:23 crc kubenswrapper[4766]: I1209 05:29:23.058667 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-hjl6d"] Dec 09 05:29:23 crc kubenswrapper[4766]: I1209 05:29:23.059435 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:29:24 crc kubenswrapper[4766]: I1209 05:29:24.075533 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" event={"ID":"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9","Type":"ContainerStarted","Data":"c7c6052422685762ac3ae637be1bac7b11a519cae6f6fdf2fa0982a2bc777c6e"} Dec 09 05:29:24 crc kubenswrapper[4766]: I1209 05:29:24.075879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" event={"ID":"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9","Type":"ContainerStarted","Data":"2348302638d4016ddfe7635bf75eb1206cd2a426837912cadcf8423f7a71d597"} Dec 09 05:29:24 crc kubenswrapper[4766]: I1209 05:29:24.103227 4766 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" podStartSLOduration=1.923368816 podStartE2EDuration="2.103189083s" podCreationTimestamp="2025-12-09 05:29:22 +0000 UTC" firstStartedPulling="2025-12-09 05:29:23.059061261 +0000 UTC m=+8244.768366687" lastFinishedPulling="2025-12-09 05:29:23.238881488 +0000 UTC m=+8244.948186954" observedRunningTime="2025-12-09 05:29:24.100546182 +0000 UTC m=+8245.809851608" watchObservedRunningTime="2025-12-09 05:29:24.103189083 +0000 UTC m=+8245.812494509" Dec 09 05:29:37 crc kubenswrapper[4766]: I1209 05:29:37.316524 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:29:37 crc kubenswrapper[4766]: I1209 05:29:37.316932 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:29:37 crc kubenswrapper[4766]: I1209 05:29:37.316981 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:29:37 crc kubenswrapper[4766]: I1209 05:29:37.317975 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0115129135ca679d8e3aa30f2321dde0fc2dc2a485d2fb80532fe4fd9d7c79b2"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:29:37 crc kubenswrapper[4766]: I1209 05:29:37.318090 4766 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://0115129135ca679d8e3aa30f2321dde0fc2dc2a485d2fb80532fe4fd9d7c79b2" gracePeriod=600 Dec 09 05:29:38 crc kubenswrapper[4766]: I1209 05:29:38.213028 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="0115129135ca679d8e3aa30f2321dde0fc2dc2a485d2fb80532fe4fd9d7c79b2" exitCode=0 Dec 09 05:29:38 crc kubenswrapper[4766]: I1209 05:29:38.213105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"0115129135ca679d8e3aa30f2321dde0fc2dc2a485d2fb80532fe4fd9d7c79b2"} Dec 09 05:29:38 crc kubenswrapper[4766]: I1209 05:29:38.213400 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f"} Dec 09 05:29:38 crc kubenswrapper[4766]: I1209 05:29:38.213421 4766 scope.go:117] "RemoveContainer" containerID="0d9b64651ed53ff07894b1947b773c9e9d13926afe6278671783e2f1208cb562" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.172488 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb"] Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.174945 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.177944 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.178068 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.200838 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb"] Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.279263 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdp4x\" (UniqueName: \"kubernetes.io/projected/f2aa4d62-9d9b-4906-ba66-56d149431aba-kube-api-access-gdp4x\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.279452 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2aa4d62-9d9b-4906-ba66-56d149431aba-config-volume\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.279826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2aa4d62-9d9b-4906-ba66-56d149431aba-secret-volume\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.381432 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdp4x\" (UniqueName: \"kubernetes.io/projected/f2aa4d62-9d9b-4906-ba66-56d149431aba-kube-api-access-gdp4x\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.381512 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2aa4d62-9d9b-4906-ba66-56d149431aba-config-volume\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.381609 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2aa4d62-9d9b-4906-ba66-56d149431aba-secret-volume\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.382471 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2aa4d62-9d9b-4906-ba66-56d149431aba-config-volume\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.387729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f2aa4d62-9d9b-4906-ba66-56d149431aba-secret-volume\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.398569 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdp4x\" (UniqueName: \"kubernetes.io/projected/f2aa4d62-9d9b-4906-ba66-56d149431aba-kube-api-access-gdp4x\") pod \"collect-profiles-29420970-pwxzb\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.506927 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:00 crc kubenswrapper[4766]: I1209 05:30:00.956664 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb"] Dec 09 05:30:01 crc kubenswrapper[4766]: I1209 05:30:01.451699 4766 generic.go:334] "Generic (PLEG): container finished" podID="f2aa4d62-9d9b-4906-ba66-56d149431aba" containerID="36ad6b31993cf9c14e3c4f4e685c94e0097dafbcc8f54ffe8f0091552dd5915d" exitCode=0 Dec 09 05:30:01 crc kubenswrapper[4766]: I1209 05:30:01.451772 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" event={"ID":"f2aa4d62-9d9b-4906-ba66-56d149431aba","Type":"ContainerDied","Data":"36ad6b31993cf9c14e3c4f4e685c94e0097dafbcc8f54ffe8f0091552dd5915d"} Dec 09 05:30:01 crc kubenswrapper[4766]: I1209 05:30:01.452119 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" 
event={"ID":"f2aa4d62-9d9b-4906-ba66-56d149431aba","Type":"ContainerStarted","Data":"2056da349a7f5394d8a90bccc562a038aae40266e3230be78d1e55b78ab0998f"} Dec 09 05:30:02 crc kubenswrapper[4766]: I1209 05:30:02.834748 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:02 crc kubenswrapper[4766]: I1209 05:30:02.940538 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2aa4d62-9d9b-4906-ba66-56d149431aba-secret-volume\") pod \"f2aa4d62-9d9b-4906-ba66-56d149431aba\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " Dec 09 05:30:02 crc kubenswrapper[4766]: I1209 05:30:02.940732 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdp4x\" (UniqueName: \"kubernetes.io/projected/f2aa4d62-9d9b-4906-ba66-56d149431aba-kube-api-access-gdp4x\") pod \"f2aa4d62-9d9b-4906-ba66-56d149431aba\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " Dec 09 05:30:02 crc kubenswrapper[4766]: I1209 05:30:02.940801 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2aa4d62-9d9b-4906-ba66-56d149431aba-config-volume\") pod \"f2aa4d62-9d9b-4906-ba66-56d149431aba\" (UID: \"f2aa4d62-9d9b-4906-ba66-56d149431aba\") " Dec 09 05:30:02 crc kubenswrapper[4766]: I1209 05:30:02.941775 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2aa4d62-9d9b-4906-ba66-56d149431aba-config-volume" (OuterVolumeSpecName: "config-volume") pod "f2aa4d62-9d9b-4906-ba66-56d149431aba" (UID: "f2aa4d62-9d9b-4906-ba66-56d149431aba"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:30:02 crc kubenswrapper[4766]: I1209 05:30:02.947651 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2aa4d62-9d9b-4906-ba66-56d149431aba-kube-api-access-gdp4x" (OuterVolumeSpecName: "kube-api-access-gdp4x") pod "f2aa4d62-9d9b-4906-ba66-56d149431aba" (UID: "f2aa4d62-9d9b-4906-ba66-56d149431aba"). InnerVolumeSpecName "kube-api-access-gdp4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:30:02 crc kubenswrapper[4766]: I1209 05:30:02.948090 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2aa4d62-9d9b-4906-ba66-56d149431aba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f2aa4d62-9d9b-4906-ba66-56d149431aba" (UID: "f2aa4d62-9d9b-4906-ba66-56d149431aba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:30:03 crc kubenswrapper[4766]: I1209 05:30:03.043843 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2aa4d62-9d9b-4906-ba66-56d149431aba-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 05:30:03 crc kubenswrapper[4766]: I1209 05:30:03.043883 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdp4x\" (UniqueName: \"kubernetes.io/projected/f2aa4d62-9d9b-4906-ba66-56d149431aba-kube-api-access-gdp4x\") on node \"crc\" DevicePath \"\"" Dec 09 05:30:03 crc kubenswrapper[4766]: I1209 05:30:03.043901 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2aa4d62-9d9b-4906-ba66-56d149431aba-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 05:30:03 crc kubenswrapper[4766]: I1209 05:30:03.471923 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" 
event={"ID":"f2aa4d62-9d9b-4906-ba66-56d149431aba","Type":"ContainerDied","Data":"2056da349a7f5394d8a90bccc562a038aae40266e3230be78d1e55b78ab0998f"} Dec 09 05:30:03 crc kubenswrapper[4766]: I1209 05:30:03.471967 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2056da349a7f5394d8a90bccc562a038aae40266e3230be78d1e55b78ab0998f" Dec 09 05:30:03 crc kubenswrapper[4766]: I1209 05:30:03.472264 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420970-pwxzb" Dec 09 05:30:03 crc kubenswrapper[4766]: I1209 05:30:03.912314 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6"] Dec 09 05:30:03 crc kubenswrapper[4766]: I1209 05:30:03.922023 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420925-npxr6"] Dec 09 05:30:04 crc kubenswrapper[4766]: I1209 05:30:04.860924 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3861350-bc7d-4211-a4ba-fe76779c97f7" path="/var/lib/kubelet/pods/c3861350-bc7d-4211-a4ba-fe76779c97f7/volumes" Dec 09 05:30:26 crc kubenswrapper[4766]: I1209 05:30:26.996693 4766 scope.go:117] "RemoveContainer" containerID="a5921f833aef397b51a6560d82869249a07eff39811a083068a2f713dced4e64" Dec 09 05:32:07 crc kubenswrapper[4766]: I1209 05:32:07.317001 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:32:07 crc kubenswrapper[4766]: I1209 05:32:07.317449 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:32:10 crc kubenswrapper[4766]: I1209 05:32:10.787537 4766 generic.go:334] "Generic (PLEG): container finished" podID="5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" containerID="c7c6052422685762ac3ae637be1bac7b11a519cae6f6fdf2fa0982a2bc777c6e" exitCode=0 Dec 09 05:32:10 crc kubenswrapper[4766]: I1209 05:32:10.787626 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" event={"ID":"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9","Type":"ContainerDied","Data":"c7c6052422685762ac3ae637be1bac7b11a519cae6f6fdf2fa0982a2bc777c6e"} Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.294600 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451004 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ceph\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451176 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-0\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451284 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-0\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: 
\"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451403 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-combined-ca-bundle\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451596 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ssh-key\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451662 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-1\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451746 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-inventory\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451795 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-1\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.451884 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rbg7q\" (UniqueName: \"kubernetes.io/projected/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-kube-api-access-rbg7q\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.452000 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-1\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.452097 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-0\") pod \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\" (UID: \"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9\") " Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.458964 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-kube-api-access-rbg7q" (OuterVolumeSpecName: "kube-api-access-rbg7q") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "kube-api-access-rbg7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.467992 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.470924 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ceph" (OuterVolumeSpecName: "ceph") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.487017 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.487850 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.488274 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.490675 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.491527 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.493889 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.494025 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-inventory" (OuterVolumeSpecName: "inventory") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.510589 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" (UID: "5ff84ae1-d39e-43fb-8ceb-2bd935d486b9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555535 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555572 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555586 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555596 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555604 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbg7q\" (UniqueName: \"kubernetes.io/projected/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-kube-api-access-rbg7q\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555614 4766 reconciler_common.go:293] "Volume 
detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555622 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555630 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555639 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555646 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.555654 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff84ae1-d39e-43fb-8ceb-2bd935d486b9-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.808000 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" event={"ID":"5ff84ae1-d39e-43fb-8ceb-2bd935d486b9","Type":"ContainerDied","Data":"2348302638d4016ddfe7635bf75eb1206cd2a426837912cadcf8423f7a71d597"} Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.808055 4766 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2348302638d4016ddfe7635bf75eb1206cd2a426837912cadcf8423f7a71d597" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.808081 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-hjl6d" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.928691 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-nskww"] Dec 09 05:32:12 crc kubenswrapper[4766]: E1209 05:32:12.929656 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" containerName="nova-cell1-openstack-openstack-cell1" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.929750 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" containerName="nova-cell1-openstack-openstack-cell1" Dec 09 05:32:12 crc kubenswrapper[4766]: E1209 05:32:12.929888 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2aa4d62-9d9b-4906-ba66-56d149431aba" containerName="collect-profiles" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.929957 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2aa4d62-9d9b-4906-ba66-56d149431aba" containerName="collect-profiles" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.931543 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2aa4d62-9d9b-4906-ba66-56d149431aba" containerName="collect-profiles" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.931702 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff84ae1-d39e-43fb-8ceb-2bd935d486b9" containerName="nova-cell1-openstack-openstack-cell1" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.932753 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.935083 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.935281 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.935562 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.935770 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.936122 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:32:12 crc kubenswrapper[4766]: I1209 05:32:12.940777 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-nskww"] Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.069959 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.070080 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qncjq\" (UniqueName: \"kubernetes.io/projected/2cc13e31-7024-4112-bddd-c79864a1df19-kube-api-access-qncjq\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " 
pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.070110 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceph\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.070131 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.070150 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.070185 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ssh-key\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.070279 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.070312 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-inventory\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.171828 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-inventory\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.171951 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.172120 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qncjq\" (UniqueName: \"kubernetes.io/projected/2cc13e31-7024-4112-bddd-c79864a1df19-kube-api-access-qncjq\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 
09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.172163 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceph\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.172189 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.172242 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.172301 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ssh-key\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.172349 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-telemetry-combined-ca-bundle\") pod 
\"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.176483 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceph\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.176596 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.176917 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.177291 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ssh-key\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.178032 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-inventory\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.178075 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.189932 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.196792 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qncjq\" (UniqueName: \"kubernetes.io/projected/2cc13e31-7024-4112-bddd-c79864a1df19-kube-api-access-qncjq\") pod \"telemetry-openstack-openstack-cell1-nskww\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.260491 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:32:13 crc kubenswrapper[4766]: I1209 05:32:13.867264 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-nskww"] Dec 09 05:32:14 crc kubenswrapper[4766]: I1209 05:32:14.868982 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-nskww" event={"ID":"2cc13e31-7024-4112-bddd-c79864a1df19","Type":"ContainerStarted","Data":"751f391a0342c2eccefeac2b5509c21740383ec319b0ac4611a24e914de391ab"} Dec 09 05:32:14 crc kubenswrapper[4766]: I1209 05:32:14.869284 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-nskww" event={"ID":"2cc13e31-7024-4112-bddd-c79864a1df19","Type":"ContainerStarted","Data":"fbbe5b4a25c0a6c656ec450beab7a5ec00fd20b048e7d822c358ae5316c3ec30"} Dec 09 05:32:14 crc kubenswrapper[4766]: I1209 05:32:14.891474 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-nskww" podStartSLOduration=2.707688136 podStartE2EDuration="2.891441421s" podCreationTimestamp="2025-12-09 05:32:12 +0000 UTC" firstStartedPulling="2025-12-09 05:32:13.874455172 +0000 UTC m=+8415.583760598" lastFinishedPulling="2025-12-09 05:32:14.058208457 +0000 UTC m=+8415.767513883" observedRunningTime="2025-12-09 05:32:14.889120398 +0000 UTC m=+8416.598425834" watchObservedRunningTime="2025-12-09 05:32:14.891441421 +0000 UTC m=+8416.600746847" Dec 09 05:32:37 crc kubenswrapper[4766]: I1209 05:32:37.316270 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:32:37 crc kubenswrapper[4766]: I1209 05:32:37.316779 4766 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.316182 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.316916 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.316971 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.317928 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.317976 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" 
containerID="cri-o://5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" gracePeriod=600 Dec 09 05:33:07 crc kubenswrapper[4766]: E1209 05:33:07.454185 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.503489 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" exitCode=0 Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.503539 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f"} Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.503577 4766 scope.go:117] "RemoveContainer" containerID="0115129135ca679d8e3aa30f2321dde0fc2dc2a485d2fb80532fe4fd9d7c79b2" Dec 09 05:33:07 crc kubenswrapper[4766]: I1209 05:33:07.504139 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:33:07 crc kubenswrapper[4766]: E1209 05:33:07.504512 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:33:19 crc kubenswrapper[4766]: I1209 05:33:19.840622 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:33:19 crc kubenswrapper[4766]: E1209 05:33:19.841507 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:33:27 crc kubenswrapper[4766]: I1209 05:33:27.096186 4766 scope.go:117] "RemoveContainer" containerID="5ec3de8f6bb29d4803ae7afcae8d67ebaba134ed49153e0f5a8d99e5ce88001a" Dec 09 05:33:27 crc kubenswrapper[4766]: I1209 05:33:27.125473 4766 scope.go:117] "RemoveContainer" containerID="1d28d3c46ea986d56a08fea91db2fff81e90d088c8b1ffd2806aa45af892c3e1" Dec 09 05:33:27 crc kubenswrapper[4766]: I1209 05:33:27.197806 4766 scope.go:117] "RemoveContainer" containerID="3a79c6fc3c81d8d2fcde21e0998ae00fe89f9559b6cba04e7921961d7585740d" Dec 09 05:33:34 crc kubenswrapper[4766]: I1209 05:33:34.838763 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:33:34 crc kubenswrapper[4766]: E1209 05:33:34.839677 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:33:48 crc kubenswrapper[4766]: I1209 05:33:48.847026 4766 
scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:33:48 crc kubenswrapper[4766]: E1209 05:33:48.847752 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:34:03 crc kubenswrapper[4766]: I1209 05:34:03.840431 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:34:03 crc kubenswrapper[4766]: E1209 05:34:03.841478 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.714673 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9gmcj"] Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.719286 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.724946 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gmcj"] Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.801940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-utilities\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.802020 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pgc\" (UniqueName: \"kubernetes.io/projected/123be68a-5f03-429d-acfe-4d2af434a718-kube-api-access-v7pgc\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.802059 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-catalog-content\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.903848 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-utilities\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.903948 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v7pgc\" (UniqueName: \"kubernetes.io/projected/123be68a-5f03-429d-acfe-4d2af434a718-kube-api-access-v7pgc\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.903988 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-catalog-content\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.904530 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-utilities\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.904561 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-catalog-content\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:09 crc kubenswrapper[4766]: I1209 05:34:09.931301 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pgc\" (UniqueName: \"kubernetes.io/projected/123be68a-5f03-429d-acfe-4d2af434a718-kube-api-access-v7pgc\") pod \"community-operators-9gmcj\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:10 crc kubenswrapper[4766]: I1209 05:34:10.051554 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:10 crc kubenswrapper[4766]: I1209 05:34:10.603437 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gmcj"] Dec 09 05:34:11 crc kubenswrapper[4766]: I1209 05:34:11.219341 4766 generic.go:334] "Generic (PLEG): container finished" podID="123be68a-5f03-429d-acfe-4d2af434a718" containerID="755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49" exitCode=0 Dec 09 05:34:11 crc kubenswrapper[4766]: I1209 05:34:11.219477 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gmcj" event={"ID":"123be68a-5f03-429d-acfe-4d2af434a718","Type":"ContainerDied","Data":"755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49"} Dec 09 05:34:11 crc kubenswrapper[4766]: I1209 05:34:11.219797 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gmcj" event={"ID":"123be68a-5f03-429d-acfe-4d2af434a718","Type":"ContainerStarted","Data":"d71e7e32a0f05a092d767c5a1b18c5ac1f1e8aad2cccfc0c8423b7ca20c765e5"} Dec 09 05:34:13 crc kubenswrapper[4766]: I1209 05:34:13.239327 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gmcj" event={"ID":"123be68a-5f03-429d-acfe-4d2af434a718","Type":"ContainerStarted","Data":"f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6"} Dec 09 05:34:14 crc kubenswrapper[4766]: I1209 05:34:14.249531 4766 generic.go:334] "Generic (PLEG): container finished" podID="123be68a-5f03-429d-acfe-4d2af434a718" containerID="f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6" exitCode=0 Dec 09 05:34:14 crc kubenswrapper[4766]: I1209 05:34:14.249596 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gmcj" 
event={"ID":"123be68a-5f03-429d-acfe-4d2af434a718","Type":"ContainerDied","Data":"f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6"} Dec 09 05:34:14 crc kubenswrapper[4766]: I1209 05:34:14.839410 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:34:14 crc kubenswrapper[4766]: E1209 05:34:14.839935 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:34:15 crc kubenswrapper[4766]: I1209 05:34:15.265242 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gmcj" event={"ID":"123be68a-5f03-429d-acfe-4d2af434a718","Type":"ContainerStarted","Data":"37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6"} Dec 09 05:34:15 crc kubenswrapper[4766]: I1209 05:34:15.288331 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9gmcj" podStartSLOduration=2.863887004 podStartE2EDuration="6.288309719s" podCreationTimestamp="2025-12-09 05:34:09 +0000 UTC" firstStartedPulling="2025-12-09 05:34:11.223318884 +0000 UTC m=+8532.932624340" lastFinishedPulling="2025-12-09 05:34:14.647741629 +0000 UTC m=+8536.357047055" observedRunningTime="2025-12-09 05:34:15.287087265 +0000 UTC m=+8536.996392691" watchObservedRunningTime="2025-12-09 05:34:15.288309719 +0000 UTC m=+8536.997615145" Dec 09 05:34:20 crc kubenswrapper[4766]: I1209 05:34:20.052962 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:20 crc 
kubenswrapper[4766]: I1209 05:34:20.053775 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:20 crc kubenswrapper[4766]: I1209 05:34:20.120898 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:20 crc kubenswrapper[4766]: I1209 05:34:20.442195 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:20 crc kubenswrapper[4766]: I1209 05:34:20.522579 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gmcj"] Dec 09 05:34:22 crc kubenswrapper[4766]: I1209 05:34:22.353098 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9gmcj" podUID="123be68a-5f03-429d-acfe-4d2af434a718" containerName="registry-server" containerID="cri-o://37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6" gracePeriod=2 Dec 09 05:34:22 crc kubenswrapper[4766]: I1209 05:34:22.987610 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.099477 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7pgc\" (UniqueName: \"kubernetes.io/projected/123be68a-5f03-429d-acfe-4d2af434a718-kube-api-access-v7pgc\") pod \"123be68a-5f03-429d-acfe-4d2af434a718\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.099900 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-utilities\") pod \"123be68a-5f03-429d-acfe-4d2af434a718\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.099983 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-catalog-content\") pod \"123be68a-5f03-429d-acfe-4d2af434a718\" (UID: \"123be68a-5f03-429d-acfe-4d2af434a718\") " Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.101175 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-utilities" (OuterVolumeSpecName: "utilities") pod "123be68a-5f03-429d-acfe-4d2af434a718" (UID: "123be68a-5f03-429d-acfe-4d2af434a718"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.109477 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/123be68a-5f03-429d-acfe-4d2af434a718-kube-api-access-v7pgc" (OuterVolumeSpecName: "kube-api-access-v7pgc") pod "123be68a-5f03-429d-acfe-4d2af434a718" (UID: "123be68a-5f03-429d-acfe-4d2af434a718"). InnerVolumeSpecName "kube-api-access-v7pgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.182162 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "123be68a-5f03-429d-acfe-4d2af434a718" (UID: "123be68a-5f03-429d-acfe-4d2af434a718"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.201819 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7pgc\" (UniqueName: \"kubernetes.io/projected/123be68a-5f03-429d-acfe-4d2af434a718-kube-api-access-v7pgc\") on node \"crc\" DevicePath \"\"" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.201849 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.201859 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123be68a-5f03-429d-acfe-4d2af434a718-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.369950 4766 generic.go:334] "Generic (PLEG): container finished" podID="123be68a-5f03-429d-acfe-4d2af434a718" containerID="37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6" exitCode=0 Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.370060 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gmcj" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.370301 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gmcj" event={"ID":"123be68a-5f03-429d-acfe-4d2af434a718","Type":"ContainerDied","Data":"37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6"} Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.371732 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gmcj" event={"ID":"123be68a-5f03-429d-acfe-4d2af434a718","Type":"ContainerDied","Data":"d71e7e32a0f05a092d767c5a1b18c5ac1f1e8aad2cccfc0c8423b7ca20c765e5"} Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.371812 4766 scope.go:117] "RemoveContainer" containerID="37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.410620 4766 scope.go:117] "RemoveContainer" containerID="f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.420057 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gmcj"] Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.432227 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9gmcj"] Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.443881 4766 scope.go:117] "RemoveContainer" containerID="755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.507807 4766 scope.go:117] "RemoveContainer" containerID="37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6" Dec 09 05:34:23 crc kubenswrapper[4766]: E1209 05:34:23.508083 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6\": container with ID starting with 37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6 not found: ID does not exist" containerID="37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.508120 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6"} err="failed to get container status \"37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6\": rpc error: code = NotFound desc = could not find container \"37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6\": container with ID starting with 37c0487dd358671e3f4ba3141261bf0082877288a2226c3a3e15353292f166a6 not found: ID does not exist" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.508137 4766 scope.go:117] "RemoveContainer" containerID="f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6" Dec 09 05:34:23 crc kubenswrapper[4766]: E1209 05:34:23.508398 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6\": container with ID starting with f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6 not found: ID does not exist" containerID="f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.508428 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6"} err="failed to get container status \"f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6\": rpc error: code = NotFound desc = could not find container \"f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6\": container with ID 
starting with f1e873df3c949e0bc339415df6397834e71fdc21d9c00ebc754a33e80c805ac6 not found: ID does not exist" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.508447 4766 scope.go:117] "RemoveContainer" containerID="755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49" Dec 09 05:34:23 crc kubenswrapper[4766]: E1209 05:34:23.508658 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49\": container with ID starting with 755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49 not found: ID does not exist" containerID="755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49" Dec 09 05:34:23 crc kubenswrapper[4766]: I1209 05:34:23.508686 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49"} err="failed to get container status \"755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49\": rpc error: code = NotFound desc = could not find container \"755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49\": container with ID starting with 755e4087402abdf1ed312e42e7e4631a38198bf8e03074100eb1f94796457d49 not found: ID does not exist" Dec 09 05:34:24 crc kubenswrapper[4766]: I1209 05:34:24.854296 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="123be68a-5f03-429d-acfe-4d2af434a718" path="/var/lib/kubelet/pods/123be68a-5f03-429d-acfe-4d2af434a718/volumes" Dec 09 05:34:28 crc kubenswrapper[4766]: I1209 05:34:28.856076 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:34:28 crc kubenswrapper[4766]: E1209 05:34:28.857132 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:34:42 crc kubenswrapper[4766]: I1209 05:34:42.839342 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:34:42 crc kubenswrapper[4766]: E1209 05:34:42.840153 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:34:55 crc kubenswrapper[4766]: I1209 05:34:55.839504 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:34:55 crc kubenswrapper[4766]: E1209 05:34:55.840171 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:35:09 crc kubenswrapper[4766]: I1209 05:35:09.840134 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:35:09 crc kubenswrapper[4766]: E1209 05:35:09.841609 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:35:21 crc kubenswrapper[4766]: I1209 05:35:21.839412 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:35:21 crc kubenswrapper[4766]: E1209 05:35:21.842975 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:35:33 crc kubenswrapper[4766]: I1209 05:35:33.840318 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:35:33 crc kubenswrapper[4766]: E1209 05:35:33.841093 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:35:45 crc kubenswrapper[4766]: I1209 05:35:45.840589 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:35:45 crc kubenswrapper[4766]: E1209 05:35:45.841829 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:35:57 crc kubenswrapper[4766]: I1209 05:35:57.840096 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:35:57 crc kubenswrapper[4766]: E1209 05:35:57.841068 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:36:11 crc kubenswrapper[4766]: I1209 05:36:11.840140 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:36:11 crc kubenswrapper[4766]: E1209 05:36:11.840768 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:36:17 crc kubenswrapper[4766]: I1209 05:36:17.763690 4766 generic.go:334] "Generic (PLEG): container finished" podID="2cc13e31-7024-4112-bddd-c79864a1df19" containerID="751f391a0342c2eccefeac2b5509c21740383ec319b0ac4611a24e914de391ab" exitCode=0 Dec 09 05:36:17 crc kubenswrapper[4766]: I1209 05:36:17.763747 4766 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-nskww" event={"ID":"2cc13e31-7024-4112-bddd-c79864a1df19","Type":"ContainerDied","Data":"751f391a0342c2eccefeac2b5509c21740383ec319b0ac4611a24e914de391ab"} Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.331281 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.427998 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceph\") pod \"2cc13e31-7024-4112-bddd-c79864a1df19\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.428057 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-0\") pod \"2cc13e31-7024-4112-bddd-c79864a1df19\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.428106 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ssh-key\") pod \"2cc13e31-7024-4112-bddd-c79864a1df19\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.428174 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-1\") pod \"2cc13e31-7024-4112-bddd-c79864a1df19\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.428224 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-telemetry-combined-ca-bundle\") pod \"2cc13e31-7024-4112-bddd-c79864a1df19\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.429002 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-inventory\") pod \"2cc13e31-7024-4112-bddd-c79864a1df19\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.429082 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qncjq\" (UniqueName: \"kubernetes.io/projected/2cc13e31-7024-4112-bddd-c79864a1df19-kube-api-access-qncjq\") pod \"2cc13e31-7024-4112-bddd-c79864a1df19\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.429154 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-2\") pod \"2cc13e31-7024-4112-bddd-c79864a1df19\" (UID: \"2cc13e31-7024-4112-bddd-c79864a1df19\") " Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.433634 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceph" (OuterVolumeSpecName: "ceph") pod "2cc13e31-7024-4112-bddd-c79864a1df19" (UID: "2cc13e31-7024-4112-bddd-c79864a1df19"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.434021 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc13e31-7024-4112-bddd-c79864a1df19-kube-api-access-qncjq" (OuterVolumeSpecName: "kube-api-access-qncjq") pod "2cc13e31-7024-4112-bddd-c79864a1df19" (UID: "2cc13e31-7024-4112-bddd-c79864a1df19"). InnerVolumeSpecName "kube-api-access-qncjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.434334 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2cc13e31-7024-4112-bddd-c79864a1df19" (UID: "2cc13e31-7024-4112-bddd-c79864a1df19"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.458064 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cc13e31-7024-4112-bddd-c79864a1df19" (UID: "2cc13e31-7024-4112-bddd-c79864a1df19"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.464127 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-inventory" (OuterVolumeSpecName: "inventory") pod "2cc13e31-7024-4112-bddd-c79864a1df19" (UID: "2cc13e31-7024-4112-bddd-c79864a1df19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.466481 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2cc13e31-7024-4112-bddd-c79864a1df19" (UID: "2cc13e31-7024-4112-bddd-c79864a1df19"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.468538 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2cc13e31-7024-4112-bddd-c79864a1df19" (UID: "2cc13e31-7024-4112-bddd-c79864a1df19"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.489614 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2cc13e31-7024-4112-bddd-c79864a1df19" (UID: "2cc13e31-7024-4112-bddd-c79864a1df19"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.532009 4766 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.532041 4766 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.532052 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.532062 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qncjq\" (UniqueName: \"kubernetes.io/projected/2cc13e31-7024-4112-bddd-c79864a1df19-kube-api-access-qncjq\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.532071 4766 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.532090 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.532101 4766 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ceilometer-compute-config-data-0\") 
on node \"crc\" DevicePath \"\"" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.532112 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cc13e31-7024-4112-bddd-c79864a1df19-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.792150 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-nskww" event={"ID":"2cc13e31-7024-4112-bddd-c79864a1df19","Type":"ContainerDied","Data":"fbbe5b4a25c0a6c656ec450beab7a5ec00fd20b048e7d822c358ae5316c3ec30"} Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.792305 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbbe5b4a25c0a6c656ec450beab7a5ec00fd20b048e7d822c358ae5316c3ec30" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.792191 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-nskww" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.932840 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-k6sgx"] Dec 09 05:36:19 crc kubenswrapper[4766]: E1209 05:36:19.933729 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123be68a-5f03-429d-acfe-4d2af434a718" containerName="registry-server" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.933753 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="123be68a-5f03-429d-acfe-4d2af434a718" containerName="registry-server" Dec 09 05:36:19 crc kubenswrapper[4766]: E1209 05:36:19.933773 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123be68a-5f03-429d-acfe-4d2af434a718" containerName="extract-content" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.933779 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="123be68a-5f03-429d-acfe-4d2af434a718" 
containerName="extract-content" Dec 09 05:36:19 crc kubenswrapper[4766]: E1209 05:36:19.933789 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123be68a-5f03-429d-acfe-4d2af434a718" containerName="extract-utilities" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.933795 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="123be68a-5f03-429d-acfe-4d2af434a718" containerName="extract-utilities" Dec 09 05:36:19 crc kubenswrapper[4766]: E1209 05:36:19.933817 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc13e31-7024-4112-bddd-c79864a1df19" containerName="telemetry-openstack-openstack-cell1" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.933825 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc13e31-7024-4112-bddd-c79864a1df19" containerName="telemetry-openstack-openstack-cell1" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.934060 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="123be68a-5f03-429d-acfe-4d2af434a718" containerName="registry-server" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.934082 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc13e31-7024-4112-bddd-c79864a1df19" containerName="telemetry-openstack-openstack-cell1" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.934865 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.942818 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.943049 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.943313 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.943379 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.943676 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:36:19 crc kubenswrapper[4766]: I1209 05:36:19.949740 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-k6sgx"] Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.042282 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.042405 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfvzl\" (UniqueName: \"kubernetes.io/projected/1992f13d-f848-4cef-a27c-f464d65b48f2-kube-api-access-pfvzl\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.042718 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.042787 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.042824 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.042901 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.145417 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.145492 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfvzl\" (UniqueName: \"kubernetes.io/projected/1992f13d-f848-4cef-a27c-f464d65b48f2-kube-api-access-pfvzl\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.145580 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.145607 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.145631 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 
crc kubenswrapper[4766]: I1209 05:36:20.145666 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.149029 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.149695 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.149898 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.149995 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-inventory\") pod 
\"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.154815 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.163187 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfvzl\" (UniqueName: \"kubernetes.io/projected/1992f13d-f848-4cef-a27c-f464d65b48f2-kube-api-access-pfvzl\") pod \"neutron-sriov-openstack-openstack-cell1-k6sgx\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.253526 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.918288 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-k6sgx"] Dec 09 05:36:20 crc kubenswrapper[4766]: I1209 05:36:20.935330 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:36:21 crc kubenswrapper[4766]: I1209 05:36:21.813561 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" event={"ID":"1992f13d-f848-4cef-a27c-f464d65b48f2","Type":"ContainerStarted","Data":"6c20cd3b84bb4c8560bc86710616fea7fe46fe8b014fdb6063c6cecc34902278"} Dec 09 05:36:21 crc kubenswrapper[4766]: I1209 05:36:21.813820 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" event={"ID":"1992f13d-f848-4cef-a27c-f464d65b48f2","Type":"ContainerStarted","Data":"f69f747140de55884bd58348dd3d255b9e8c83785cc574d17ec3453921beb47f"} Dec 09 05:36:21 crc kubenswrapper[4766]: I1209 05:36:21.841024 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" podStartSLOduration=2.6174154659999997 podStartE2EDuration="2.841004588s" podCreationTimestamp="2025-12-09 05:36:19 +0000 UTC" firstStartedPulling="2025-12-09 05:36:20.935120726 +0000 UTC m=+8662.644426152" lastFinishedPulling="2025-12-09 05:36:21.158709848 +0000 UTC m=+8662.868015274" observedRunningTime="2025-12-09 05:36:21.830728339 +0000 UTC m=+8663.540033775" watchObservedRunningTime="2025-12-09 05:36:21.841004588 +0000 UTC m=+8663.550310014" Dec 09 05:36:25 crc kubenswrapper[4766]: I1209 05:36:25.839994 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:36:25 crc kubenswrapper[4766]: E1209 05:36:25.841030 4766 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:36:32 crc kubenswrapper[4766]: I1209 05:36:32.969635 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nzwzt"] Dec 09 05:36:32 crc kubenswrapper[4766]: I1209 05:36:32.972442 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:32 crc kubenswrapper[4766]: I1209 05:36:32.983715 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzwzt"] Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.069202 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-catalog-content\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.069369 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s4mx\" (UniqueName: \"kubernetes.io/projected/e7adfe12-2e53-4057-937f-4e6aa2e486a4-kube-api-access-9s4mx\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.069401 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-utilities\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.172062 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-catalog-content\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.172498 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s4mx\" (UniqueName: \"kubernetes.io/projected/e7adfe12-2e53-4057-937f-4e6aa2e486a4-kube-api-access-9s4mx\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.172542 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-utilities\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.172858 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-utilities\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.173025 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-catalog-content\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.203020 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s4mx\" (UniqueName: \"kubernetes.io/projected/e7adfe12-2e53-4057-937f-4e6aa2e486a4-kube-api-access-9s4mx\") pod \"redhat-operators-nzwzt\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.338131 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.837348 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzwzt"] Dec 09 05:36:33 crc kubenswrapper[4766]: I1209 05:36:33.938510 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzwzt" event={"ID":"e7adfe12-2e53-4057-937f-4e6aa2e486a4","Type":"ContainerStarted","Data":"caad334b56ebf144a0e849a7f02587923f3bcc7e404179fa40e21e7117b7b55a"} Dec 09 05:36:34 crc kubenswrapper[4766]: I1209 05:36:34.956271 4766 generic.go:334] "Generic (PLEG): container finished" podID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerID="cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e" exitCode=0 Dec 09 05:36:34 crc kubenswrapper[4766]: I1209 05:36:34.956408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzwzt" event={"ID":"e7adfe12-2e53-4057-937f-4e6aa2e486a4","Type":"ContainerDied","Data":"cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e"} Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.373345 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-hggl4"] Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.375774 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.389671 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hggl4"] Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.530421 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-utilities\") pod \"certified-operators-hggl4\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.530477 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhb68\" (UniqueName: \"kubernetes.io/projected/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-kube-api-access-zhb68\") pod \"certified-operators-hggl4\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.530751 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-catalog-content\") pod \"certified-operators-hggl4\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.633122 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-catalog-content\") pod \"certified-operators-hggl4\" (UID: 
\"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.633172 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-utilities\") pod \"certified-operators-hggl4\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.633193 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhb68\" (UniqueName: \"kubernetes.io/projected/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-kube-api-access-zhb68\") pod \"certified-operators-hggl4\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.634588 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-catalog-content\") pod \"certified-operators-hggl4\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.634786 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-utilities\") pod \"certified-operators-hggl4\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.656091 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhb68\" (UniqueName: \"kubernetes.io/projected/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-kube-api-access-zhb68\") pod \"certified-operators-hggl4\" (UID: 
\"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.703062 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.980996 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5fnb2"] Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.983677 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.987091 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzwzt" event={"ID":"e7adfe12-2e53-4057-937f-4e6aa2e486a4","Type":"ContainerStarted","Data":"aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b"} Dec 09 05:36:35 crc kubenswrapper[4766]: I1209 05:36:35.995642 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fnb2"] Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.145901 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzwpn\" (UniqueName: \"kubernetes.io/projected/199b4be9-728e-4590-bb64-5b6d83047ff1-kube-api-access-dzwpn\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.146320 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-utilities\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc 
kubenswrapper[4766]: I1209 05:36:36.146479 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-catalog-content\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.225340 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hggl4"] Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.248679 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzwpn\" (UniqueName: \"kubernetes.io/projected/199b4be9-728e-4590-bb64-5b6d83047ff1-kube-api-access-dzwpn\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.248718 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-utilities\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.248855 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-catalog-content\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.249333 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-catalog-content\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.250911 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-utilities\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.267706 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzwpn\" (UniqueName: \"kubernetes.io/projected/199b4be9-728e-4590-bb64-5b6d83047ff1-kube-api-access-dzwpn\") pod \"redhat-marketplace-5fnb2\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.307809 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.860508 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fnb2"] Dec 09 05:36:36 crc kubenswrapper[4766]: W1209 05:36:36.877434 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199b4be9_728e_4590_bb64_5b6d83047ff1.slice/crio-40039581e89bd09baed6657d748e0dde3327d4738cdf19e6b93907d53e629a6c WatchSource:0}: Error finding container 40039581e89bd09baed6657d748e0dde3327d4738cdf19e6b93907d53e629a6c: Status 404 returned error can't find the container with id 40039581e89bd09baed6657d748e0dde3327d4738cdf19e6b93907d53e629a6c Dec 09 05:36:36 crc kubenswrapper[4766]: I1209 05:36:36.997603 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fnb2" event={"ID":"199b4be9-728e-4590-bb64-5b6d83047ff1","Type":"ContainerStarted","Data":"40039581e89bd09baed6657d748e0dde3327d4738cdf19e6b93907d53e629a6c"} Dec 09 05:36:37 crc kubenswrapper[4766]: I1209 05:36:36.999046 4766 generic.go:334] "Generic (PLEG): container finished" podID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerID="f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4" exitCode=0 Dec 09 05:36:37 crc kubenswrapper[4766]: I1209 05:36:36.999122 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hggl4" event={"ID":"7c76c906-4d5a-4a84-b66e-8293e7c8afb1","Type":"ContainerDied","Data":"f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4"} Dec 09 05:36:37 crc kubenswrapper[4766]: I1209 05:36:36.999153 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hggl4" 
event={"ID":"7c76c906-4d5a-4a84-b66e-8293e7c8afb1","Type":"ContainerStarted","Data":"c4dc751cbe1ae4efbbb658a4d369abe361ad9b8696fabf1174d5f5e2313312b0"} Dec 09 05:36:38 crc kubenswrapper[4766]: I1209 05:36:38.850763 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:36:38 crc kubenswrapper[4766]: E1209 05:36:38.851775 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:36:39 crc kubenswrapper[4766]: I1209 05:36:39.038610 4766 generic.go:334] "Generic (PLEG): container finished" podID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerID="40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3" exitCode=0 Dec 09 05:36:39 crc kubenswrapper[4766]: I1209 05:36:39.038721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fnb2" event={"ID":"199b4be9-728e-4590-bb64-5b6d83047ff1","Type":"ContainerDied","Data":"40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3"} Dec 09 05:36:41 crc kubenswrapper[4766]: I1209 05:36:41.058385 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hggl4" event={"ID":"7c76c906-4d5a-4a84-b66e-8293e7c8afb1","Type":"ContainerStarted","Data":"d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68"} Dec 09 05:36:41 crc kubenswrapper[4766]: I1209 05:36:41.060381 4766 generic.go:334] "Generic (PLEG): container finished" podID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerID="aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b" exitCode=0 Dec 09 05:36:41 crc 
kubenswrapper[4766]: I1209 05:36:41.060420 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzwzt" event={"ID":"e7adfe12-2e53-4057-937f-4e6aa2e486a4","Type":"ContainerDied","Data":"aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b"} Dec 09 05:36:42 crc kubenswrapper[4766]: I1209 05:36:42.076244 4766 generic.go:334] "Generic (PLEG): container finished" podID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerID="d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68" exitCode=0 Dec 09 05:36:42 crc kubenswrapper[4766]: I1209 05:36:42.076337 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hggl4" event={"ID":"7c76c906-4d5a-4a84-b66e-8293e7c8afb1","Type":"ContainerDied","Data":"d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68"} Dec 09 05:36:43 crc kubenswrapper[4766]: I1209 05:36:43.091242 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzwzt" event={"ID":"e7adfe12-2e53-4057-937f-4e6aa2e486a4","Type":"ContainerStarted","Data":"8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50"} Dec 09 05:36:43 crc kubenswrapper[4766]: I1209 05:36:43.095010 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fnb2" event={"ID":"199b4be9-728e-4590-bb64-5b6d83047ff1","Type":"ContainerStarted","Data":"9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1"} Dec 09 05:36:43 crc kubenswrapper[4766]: I1209 05:36:43.121233 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nzwzt" podStartSLOduration=4.017817701 podStartE2EDuration="11.121193253s" podCreationTimestamp="2025-12-09 05:36:32 +0000 UTC" firstStartedPulling="2025-12-09 05:36:34.959303738 +0000 UTC m=+8676.668609194" lastFinishedPulling="2025-12-09 05:36:42.06267932 +0000 UTC m=+8683.771984746" 
observedRunningTime="2025-12-09 05:36:43.116205398 +0000 UTC m=+8684.825510844" watchObservedRunningTime="2025-12-09 05:36:43.121193253 +0000 UTC m=+8684.830498679" Dec 09 05:36:43 crc kubenswrapper[4766]: I1209 05:36:43.339092 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:43 crc kubenswrapper[4766]: I1209 05:36:43.339164 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:36:44 crc kubenswrapper[4766]: I1209 05:36:44.113986 4766 generic.go:334] "Generic (PLEG): container finished" podID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerID="9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1" exitCode=0 Dec 09 05:36:44 crc kubenswrapper[4766]: I1209 05:36:44.114382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fnb2" event={"ID":"199b4be9-728e-4590-bb64-5b6d83047ff1","Type":"ContainerDied","Data":"9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1"} Dec 09 05:36:44 crc kubenswrapper[4766]: I1209 05:36:44.122095 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hggl4" event={"ID":"7c76c906-4d5a-4a84-b66e-8293e7c8afb1","Type":"ContainerStarted","Data":"d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef"} Dec 09 05:36:44 crc kubenswrapper[4766]: I1209 05:36:44.169599 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hggl4" podStartSLOduration=3.5413012200000002 podStartE2EDuration="9.169577462s" podCreationTimestamp="2025-12-09 05:36:35 +0000 UTC" firstStartedPulling="2025-12-09 05:36:37.001048366 +0000 UTC m=+8678.710353812" lastFinishedPulling="2025-12-09 05:36:42.629324628 +0000 UTC m=+8684.338630054" observedRunningTime="2025-12-09 05:36:44.15807571 +0000 UTC m=+8685.867381146" 
watchObservedRunningTime="2025-12-09 05:36:44.169577462 +0000 UTC m=+8685.878882908" Dec 09 05:36:44 crc kubenswrapper[4766]: I1209 05:36:44.395801 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nzwzt" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="registry-server" probeResult="failure" output=< Dec 09 05:36:44 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:36:44 crc kubenswrapper[4766]: > Dec 09 05:36:45 crc kubenswrapper[4766]: I1209 05:36:45.141997 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fnb2" event={"ID":"199b4be9-728e-4590-bb64-5b6d83047ff1","Type":"ContainerStarted","Data":"9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2"} Dec 09 05:36:45 crc kubenswrapper[4766]: I1209 05:36:45.165522 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5fnb2" podStartSLOduration=4.776322961 podStartE2EDuration="10.165495191s" podCreationTimestamp="2025-12-09 05:36:35 +0000 UTC" firstStartedPulling="2025-12-09 05:36:39.098549974 +0000 UTC m=+8680.807855400" lastFinishedPulling="2025-12-09 05:36:44.487722214 +0000 UTC m=+8686.197027630" observedRunningTime="2025-12-09 05:36:45.16028451 +0000 UTC m=+8686.869589946" watchObservedRunningTime="2025-12-09 05:36:45.165495191 +0000 UTC m=+8686.874800617" Dec 09 05:36:45 crc kubenswrapper[4766]: I1209 05:36:45.704275 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:45 crc kubenswrapper[4766]: I1209 05:36:45.704846 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:46 crc kubenswrapper[4766]: I1209 05:36:46.308225 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:46 crc kubenswrapper[4766]: I1209 05:36:46.309422 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:46 crc kubenswrapper[4766]: I1209 05:36:46.758620 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hggl4" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="registry-server" probeResult="failure" output=< Dec 09 05:36:46 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:36:46 crc kubenswrapper[4766]: > Dec 09 05:36:47 crc kubenswrapper[4766]: I1209 05:36:47.380397 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5fnb2" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="registry-server" probeResult="failure" output=< Dec 09 05:36:47 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:36:47 crc kubenswrapper[4766]: > Dec 09 05:36:51 crc kubenswrapper[4766]: I1209 05:36:51.839348 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:36:51 crc kubenswrapper[4766]: E1209 05:36:51.840092 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:36:54 crc kubenswrapper[4766]: I1209 05:36:54.393064 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nzwzt" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="registry-server" 
probeResult="failure" output=< Dec 09 05:36:54 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:36:54 crc kubenswrapper[4766]: > Dec 09 05:36:55 crc kubenswrapper[4766]: I1209 05:36:55.754453 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:55 crc kubenswrapper[4766]: I1209 05:36:55.804745 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:55 crc kubenswrapper[4766]: I1209 05:36:55.991481 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hggl4"] Dec 09 05:36:56 crc kubenswrapper[4766]: I1209 05:36:56.395419 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:56 crc kubenswrapper[4766]: I1209 05:36:56.470049 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.284807 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hggl4" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="registry-server" containerID="cri-o://d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef" gracePeriod=2 Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.757957 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.855861 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-utilities\") pod \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.856149 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhb68\" (UniqueName: \"kubernetes.io/projected/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-kube-api-access-zhb68\") pod \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.856689 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-utilities" (OuterVolumeSpecName: "utilities") pod "7c76c906-4d5a-4a84-b66e-8293e7c8afb1" (UID: "7c76c906-4d5a-4a84-b66e-8293e7c8afb1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.857291 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-catalog-content\") pod \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\" (UID: \"7c76c906-4d5a-4a84-b66e-8293e7c8afb1\") " Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.858612 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.875522 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-kube-api-access-zhb68" (OuterVolumeSpecName: "kube-api-access-zhb68") pod "7c76c906-4d5a-4a84-b66e-8293e7c8afb1" (UID: "7c76c906-4d5a-4a84-b66e-8293e7c8afb1"). InnerVolumeSpecName "kube-api-access-zhb68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.897396 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c76c906-4d5a-4a84-b66e-8293e7c8afb1" (UID: "7c76c906-4d5a-4a84-b66e-8293e7c8afb1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.959575 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhb68\" (UniqueName: \"kubernetes.io/projected/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-kube-api-access-zhb68\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:57 crc kubenswrapper[4766]: I1209 05:36:57.959610 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c76c906-4d5a-4a84-b66e-8293e7c8afb1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.301309 4766 generic.go:334] "Generic (PLEG): container finished" podID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerID="d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef" exitCode=0 Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.301362 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hggl4" event={"ID":"7c76c906-4d5a-4a84-b66e-8293e7c8afb1","Type":"ContainerDied","Data":"d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef"} Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.301408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hggl4" event={"ID":"7c76c906-4d5a-4a84-b66e-8293e7c8afb1","Type":"ContainerDied","Data":"c4dc751cbe1ae4efbbb658a4d369abe361ad9b8696fabf1174d5f5e2313312b0"} Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.301431 4766 scope.go:117] "RemoveContainer" containerID="d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.301432 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hggl4" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.328710 4766 scope.go:117] "RemoveContainer" containerID="d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.359168 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hggl4"] Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.373161 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hggl4"] Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.383357 4766 scope.go:117] "RemoveContainer" containerID="f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.408906 4766 scope.go:117] "RemoveContainer" containerID="d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef" Dec 09 05:36:58 crc kubenswrapper[4766]: E1209 05:36:58.409525 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef\": container with ID starting with d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef not found: ID does not exist" containerID="d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.409575 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef"} err="failed to get container status \"d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef\": rpc error: code = NotFound desc = could not find container \"d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef\": container with ID starting with d5376d0117c3d61a6639e651f6075fcbaaa6941622e96f96ff230e9809e22fef not 
found: ID does not exist" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.409609 4766 scope.go:117] "RemoveContainer" containerID="d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68" Dec 09 05:36:58 crc kubenswrapper[4766]: E1209 05:36:58.410084 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68\": container with ID starting with d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68 not found: ID does not exist" containerID="d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.410157 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68"} err="failed to get container status \"d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68\": rpc error: code = NotFound desc = could not find container \"d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68\": container with ID starting with d4bb0585eb9a1330496bb68a6b5b5a12f8182cb321d0558f8bf462b088e75e68 not found: ID does not exist" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.410200 4766 scope.go:117] "RemoveContainer" containerID="f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4" Dec 09 05:36:58 crc kubenswrapper[4766]: E1209 05:36:58.411053 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4\": container with ID starting with f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4 not found: ID does not exist" containerID="f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.411095 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4"} err="failed to get container status \"f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4\": rpc error: code = NotFound desc = could not find container \"f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4\": container with ID starting with f3fdc436b6635d1bceaacea045a02d293ab8fd4f6a445dc847b60508c4065fd4 not found: ID does not exist" Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.791506 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fnb2"] Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.791821 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5fnb2" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="registry-server" containerID="cri-o://9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2" gracePeriod=2 Dec 09 05:36:58 crc kubenswrapper[4766]: I1209 05:36:58.851622 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" path="/var/lib/kubelet/pods/7c76c906-4d5a-4a84-b66e-8293e7c8afb1/volumes" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.283367 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.314352 4766 generic.go:334] "Generic (PLEG): container finished" podID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerID="9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2" exitCode=0 Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.314402 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fnb2" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.314407 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fnb2" event={"ID":"199b4be9-728e-4590-bb64-5b6d83047ff1","Type":"ContainerDied","Data":"9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2"} Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.314469 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fnb2" event={"ID":"199b4be9-728e-4590-bb64-5b6d83047ff1","Type":"ContainerDied","Data":"40039581e89bd09baed6657d748e0dde3327d4738cdf19e6b93907d53e629a6c"} Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.314487 4766 scope.go:117] "RemoveContainer" containerID="9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.343661 4766 scope.go:117] "RemoveContainer" containerID="9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.372657 4766 scope.go:117] "RemoveContainer" containerID="40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.390156 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-catalog-content\") pod \"199b4be9-728e-4590-bb64-5b6d83047ff1\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.390350 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-utilities\") pod \"199b4be9-728e-4590-bb64-5b6d83047ff1\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " Dec 09 05:36:59 crc kubenswrapper[4766]: 
I1209 05:36:59.390553 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzwpn\" (UniqueName: \"kubernetes.io/projected/199b4be9-728e-4590-bb64-5b6d83047ff1-kube-api-access-dzwpn\") pod \"199b4be9-728e-4590-bb64-5b6d83047ff1\" (UID: \"199b4be9-728e-4590-bb64-5b6d83047ff1\") " Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.391079 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-utilities" (OuterVolumeSpecName: "utilities") pod "199b4be9-728e-4590-bb64-5b6d83047ff1" (UID: "199b4be9-728e-4590-bb64-5b6d83047ff1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.392440 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.396196 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199b4be9-728e-4590-bb64-5b6d83047ff1-kube-api-access-dzwpn" (OuterVolumeSpecName: "kube-api-access-dzwpn") pod "199b4be9-728e-4590-bb64-5b6d83047ff1" (UID: "199b4be9-728e-4590-bb64-5b6d83047ff1"). InnerVolumeSpecName "kube-api-access-dzwpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.410519 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "199b4be9-728e-4590-bb64-5b6d83047ff1" (UID: "199b4be9-728e-4590-bb64-5b6d83047ff1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.467349 4766 scope.go:117] "RemoveContainer" containerID="9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2" Dec 09 05:36:59 crc kubenswrapper[4766]: E1209 05:36:59.467867 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2\": container with ID starting with 9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2 not found: ID does not exist" containerID="9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.467915 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2"} err="failed to get container status \"9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2\": rpc error: code = NotFound desc = could not find container \"9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2\": container with ID starting with 9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2 not found: ID does not exist" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.467948 4766 scope.go:117] "RemoveContainer" containerID="9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1" Dec 09 05:36:59 crc kubenswrapper[4766]: E1209 05:36:59.468400 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1\": container with ID starting with 9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1 not found: ID does not exist" containerID="9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.468432 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1"} err="failed to get container status \"9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1\": rpc error: code = NotFound desc = could not find container \"9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1\": container with ID starting with 9e675aa57c97cf698e995341bb6fe79ee4c18e749905db386a7f35f7c0c3d3f1 not found: ID does not exist" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.468457 4766 scope.go:117] "RemoveContainer" containerID="40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3" Dec 09 05:36:59 crc kubenswrapper[4766]: E1209 05:36:59.468862 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3\": container with ID starting with 40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3 not found: ID does not exist" containerID="40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.468890 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3"} err="failed to get container status \"40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3\": rpc error: code = NotFound desc = could not find container \"40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3\": container with ID starting with 40e4046b335750613e865baab6c91fbb0742e3cb6106e28fde2eb0137ef2b2a3 not found: ID does not exist" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.495408 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzwpn\" (UniqueName: 
\"kubernetes.io/projected/199b4be9-728e-4590-bb64-5b6d83047ff1-kube-api-access-dzwpn\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.495455 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/199b4be9-728e-4590-bb64-5b6d83047ff1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.657451 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fnb2"] Dec 09 05:36:59 crc kubenswrapper[4766]: I1209 05:36:59.666493 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fnb2"] Dec 09 05:37:00 crc kubenswrapper[4766]: I1209 05:37:00.852836 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" path="/var/lib/kubelet/pods/199b4be9-728e-4590-bb64-5b6d83047ff1/volumes" Dec 09 05:37:03 crc kubenswrapper[4766]: I1209 05:37:03.397526 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:37:03 crc kubenswrapper[4766]: I1209 05:37:03.452431 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:37:03 crc kubenswrapper[4766]: I1209 05:37:03.839392 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:37:03 crc kubenswrapper[4766]: E1209 05:37:03.839678 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:37:04 crc kubenswrapper[4766]: I1209 05:37:04.590750 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzwzt"] Dec 09 05:37:05 crc kubenswrapper[4766]: I1209 05:37:05.387825 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nzwzt" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="registry-server" containerID="cri-o://8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50" gracePeriod=2 Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.401734 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.405473 4766 generic.go:334] "Generic (PLEG): container finished" podID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerID="8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50" exitCode=0 Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.405513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzwzt" event={"ID":"e7adfe12-2e53-4057-937f-4e6aa2e486a4","Type":"ContainerDied","Data":"8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50"} Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.405569 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzwzt" event={"ID":"e7adfe12-2e53-4057-937f-4e6aa2e486a4","Type":"ContainerDied","Data":"caad334b56ebf144a0e849a7f02587923f3bcc7e404179fa40e21e7117b7b55a"} Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.405593 4766 scope.go:117] "RemoveContainer" containerID="8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.450642 4766 scope.go:117] "RemoveContainer" 
containerID="aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.469226 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s4mx\" (UniqueName: \"kubernetes.io/projected/e7adfe12-2e53-4057-937f-4e6aa2e486a4-kube-api-access-9s4mx\") pod \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.469358 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-utilities\") pod \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.469412 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-catalog-content\") pod \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\" (UID: \"e7adfe12-2e53-4057-937f-4e6aa2e486a4\") " Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.477332 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-utilities" (OuterVolumeSpecName: "utilities") pod "e7adfe12-2e53-4057-937f-4e6aa2e486a4" (UID: "e7adfe12-2e53-4057-937f-4e6aa2e486a4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.480723 4766 scope.go:117] "RemoveContainer" containerID="cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.486462 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7adfe12-2e53-4057-937f-4e6aa2e486a4-kube-api-access-9s4mx" (OuterVolumeSpecName: "kube-api-access-9s4mx") pod "e7adfe12-2e53-4057-937f-4e6aa2e486a4" (UID: "e7adfe12-2e53-4057-937f-4e6aa2e486a4"). InnerVolumeSpecName "kube-api-access-9s4mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.565820 4766 scope.go:117] "RemoveContainer" containerID="8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50" Dec 09 05:37:06 crc kubenswrapper[4766]: E1209 05:37:06.568692 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50\": container with ID starting with 8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50 not found: ID does not exist" containerID="8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.568752 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50"} err="failed to get container status \"8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50\": rpc error: code = NotFound desc = could not find container \"8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50\": container with ID starting with 8828638871f9cb4d6e752951a5bac5bd6e0610cb7519ad4ae4730662d6292e50 not found: ID does not exist" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.568782 
4766 scope.go:117] "RemoveContainer" containerID="aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b" Dec 09 05:37:06 crc kubenswrapper[4766]: E1209 05:37:06.569126 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b\": container with ID starting with aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b not found: ID does not exist" containerID="aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.569168 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b"} err="failed to get container status \"aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b\": rpc error: code = NotFound desc = could not find container \"aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b\": container with ID starting with aaebc572a472bb82da417ff4c36c178687282532d0c0cedf4a6ca8db36e44e6b not found: ID does not exist" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.569194 4766 scope.go:117] "RemoveContainer" containerID="cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e" Dec 09 05:37:06 crc kubenswrapper[4766]: E1209 05:37:06.569470 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e\": container with ID starting with cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e not found: ID does not exist" containerID="cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.569520 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e"} err="failed to get container status \"cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e\": rpc error: code = NotFound desc = could not find container \"cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e\": container with ID starting with cfee206bba61c816329478191f05343e034ed7e2a826830500ad4b473b52512e not found: ID does not exist" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.572156 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s4mx\" (UniqueName: \"kubernetes.io/projected/e7adfe12-2e53-4057-937f-4e6aa2e486a4-kube-api-access-9s4mx\") on node \"crc\" DevicePath \"\"" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.572189 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.585618 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7adfe12-2e53-4057-937f-4e6aa2e486a4" (UID: "e7adfe12-2e53-4057-937f-4e6aa2e486a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:37:06 crc kubenswrapper[4766]: I1209 05:37:06.673942 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7adfe12-2e53-4057-937f-4e6aa2e486a4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:37:07 crc kubenswrapper[4766]: I1209 05:37:07.421805 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nzwzt" Dec 09 05:37:07 crc kubenswrapper[4766]: E1209 05:37:07.428817 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199b4be9_728e_4590_bb64_5b6d83047ff1.slice/crio-conmon-9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2.scope\": RecentStats: unable to find data in memory cache]" Dec 09 05:37:07 crc kubenswrapper[4766]: I1209 05:37:07.450617 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzwzt"] Dec 09 05:37:07 crc kubenswrapper[4766]: I1209 05:37:07.463336 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nzwzt"] Dec 09 05:37:08 crc kubenswrapper[4766]: I1209 05:37:08.876584 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" path="/var/lib/kubelet/pods/e7adfe12-2e53-4057-937f-4e6aa2e486a4/volumes" Dec 09 05:37:14 crc kubenswrapper[4766]: I1209 05:37:14.840398 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:37:14 crc kubenswrapper[4766]: E1209 05:37:14.841323 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:37:17 crc kubenswrapper[4766]: E1209 05:37:17.747743 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199b4be9_728e_4590_bb64_5b6d83047ff1.slice/crio-conmon-9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2.scope\": RecentStats: unable to find data in memory cache]" Dec 09 05:37:25 crc kubenswrapper[4766]: I1209 05:37:25.840532 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:37:25 crc kubenswrapper[4766]: E1209 05:37:25.841769 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:37:28 crc kubenswrapper[4766]: E1209 05:37:28.080017 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199b4be9_728e_4590_bb64_5b6d83047ff1.slice/crio-conmon-9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2.scope\": RecentStats: unable to find data in memory cache]" Dec 09 05:37:36 crc kubenswrapper[4766]: I1209 05:37:36.840357 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:37:36 crc kubenswrapper[4766]: E1209 05:37:36.841807 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:37:38 crc kubenswrapper[4766]: E1209 05:37:38.397730 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199b4be9_728e_4590_bb64_5b6d83047ff1.slice/crio-conmon-9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2.scope\": RecentStats: unable to find data in memory cache]" Dec 09 05:37:47 crc kubenswrapper[4766]: I1209 05:37:47.839926 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:37:47 crc kubenswrapper[4766]: E1209 05:37:47.840707 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:37:48 crc kubenswrapper[4766]: E1209 05:37:48.681376 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199b4be9_728e_4590_bb64_5b6d83047ff1.slice/crio-conmon-9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2.scope\": RecentStats: unable to find data in memory cache]" Dec 09 05:37:58 crc kubenswrapper[4766]: E1209 05:37:58.974727 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199b4be9_728e_4590_bb64_5b6d83047ff1.slice/crio-conmon-9ca2e35f2a704ed224cec539b6d96cbe3734f18eec14e719901b6785c1ab03d2.scope\": RecentStats: unable to find data in memory cache]" Dec 09 05:37:59 crc 
kubenswrapper[4766]: I1209 05:37:59.839716 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:37:59 crc kubenswrapper[4766]: E1209 05:37:59.840347 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:38:11 crc kubenswrapper[4766]: I1209 05:38:11.839321 4766 scope.go:117] "RemoveContainer" containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:38:12 crc kubenswrapper[4766]: I1209 05:38:12.121041 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"3173c583d13d2e08ce38c6221db09bafc0bee8b88892615645184ce7426e6265"} Dec 09 05:39:50 crc kubenswrapper[4766]: I1209 05:39:50.204000 4766 generic.go:334] "Generic (PLEG): container finished" podID="1992f13d-f848-4cef-a27c-f464d65b48f2" containerID="6c20cd3b84bb4c8560bc86710616fea7fe46fe8b014fdb6063c6cecc34902278" exitCode=0 Dec 09 05:39:50 crc kubenswrapper[4766]: I1209 05:39:50.204100 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" event={"ID":"1992f13d-f848-4cef-a27c-f464d65b48f2","Type":"ContainerDied","Data":"6c20cd3b84bb4c8560bc86710616fea7fe46fe8b014fdb6063c6cecc34902278"} Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.663404 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.787390 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ssh-key\") pod \"1992f13d-f848-4cef-a27c-f464d65b48f2\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.787501 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfvzl\" (UniqueName: \"kubernetes.io/projected/1992f13d-f848-4cef-a27c-f464d65b48f2-kube-api-access-pfvzl\") pod \"1992f13d-f848-4cef-a27c-f464d65b48f2\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.787706 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-inventory\") pod \"1992f13d-f848-4cef-a27c-f464d65b48f2\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.787782 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-agent-neutron-config-0\") pod \"1992f13d-f848-4cef-a27c-f464d65b48f2\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.787819 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-combined-ca-bundle\") pod \"1992f13d-f848-4cef-a27c-f464d65b48f2\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.787870 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ceph\") pod \"1992f13d-f848-4cef-a27c-f464d65b48f2\" (UID: \"1992f13d-f848-4cef-a27c-f464d65b48f2\") " Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.793832 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ceph" (OuterVolumeSpecName: "ceph") pod "1992f13d-f848-4cef-a27c-f464d65b48f2" (UID: "1992f13d-f848-4cef-a27c-f464d65b48f2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.793862 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "1992f13d-f848-4cef-a27c-f464d65b48f2" (UID: "1992f13d-f848-4cef-a27c-f464d65b48f2"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.794568 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1992f13d-f848-4cef-a27c-f464d65b48f2-kube-api-access-pfvzl" (OuterVolumeSpecName: "kube-api-access-pfvzl") pod "1992f13d-f848-4cef-a27c-f464d65b48f2" (UID: "1992f13d-f848-4cef-a27c-f464d65b48f2"). InnerVolumeSpecName "kube-api-access-pfvzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.816826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-inventory" (OuterVolumeSpecName: "inventory") pod "1992f13d-f848-4cef-a27c-f464d65b48f2" (UID: "1992f13d-f848-4cef-a27c-f464d65b48f2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.816874 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1992f13d-f848-4cef-a27c-f464d65b48f2" (UID: "1992f13d-f848-4cef-a27c-f464d65b48f2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.821851 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "1992f13d-f848-4cef-a27c-f464d65b48f2" (UID: "1992f13d-f848-4cef-a27c-f464d65b48f2"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.891288 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.891347 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.891360 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.891372 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.891402 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1992f13d-f848-4cef-a27c-f464d65b48f2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:39:51 crc kubenswrapper[4766]: I1209 05:39:51.891412 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfvzl\" (UniqueName: \"kubernetes.io/projected/1992f13d-f848-4cef-a27c-f464d65b48f2-kube-api-access-pfvzl\") on node \"crc\" DevicePath \"\"" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.230196 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" event={"ID":"1992f13d-f848-4cef-a27c-f464d65b48f2","Type":"ContainerDied","Data":"f69f747140de55884bd58348dd3d255b9e8c83785cc574d17ec3453921beb47f"} Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.230541 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f69f747140de55884bd58348dd3d255b9e8c83785cc574d17ec3453921beb47f" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.230610 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-k6sgx" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.326652 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-cpxph"] Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327200 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="extract-content" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327240 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="extract-content" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327264 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1992f13d-f848-4cef-a27c-f464d65b48f2" containerName="neutron-sriov-openstack-openstack-cell1" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327271 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1992f13d-f848-4cef-a27c-f464d65b48f2" containerName="neutron-sriov-openstack-openstack-cell1" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327284 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327291 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327322 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="extract-utilities" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327328 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="extract-utilities" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327335 4766 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="extract-content" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327341 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="extract-content" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327352 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="extract-content" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327358 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="extract-content" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327367 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327373 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327395 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327400 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327410 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="extract-utilities" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327415 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="extract-utilities" Dec 09 05:39:52 crc kubenswrapper[4766]: E1209 05:39:52.327431 4766 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="extract-utilities" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327438 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="extract-utilities" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327627 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c76c906-4d5a-4a84-b66e-8293e7c8afb1" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327642 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1992f13d-f848-4cef-a27c-f464d65b48f2" containerName="neutron-sriov-openstack-openstack-cell1" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327651 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7adfe12-2e53-4057-937f-4e6aa2e486a4" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.327670 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="199b4be9-728e-4590-bb64-5b6d83047ff1" containerName="registry-server" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.328483 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.331575 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.331782 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.331987 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.331999 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.332277 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.339687 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-cpxph"] Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.502977 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j274t\" (UniqueName: \"kubernetes.io/projected/c04eb3ac-c0d3-470a-a741-3e8369270d60-kube-api-access-j274t\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.503033 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.503163 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.503332 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.503359 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.503384 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.605613 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.605796 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.605825 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.605846 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.605905 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j274t\" (UniqueName: \"kubernetes.io/projected/c04eb3ac-c0d3-470a-a741-3e8369270d60-kube-api-access-j274t\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.605933 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.611869 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.612726 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.613158 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.616342 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: 
\"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.620167 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.638141 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j274t\" (UniqueName: \"kubernetes.io/projected/c04eb3ac-c0d3-470a-a741-3e8369270d60-kube-api-access-j274t\") pod \"neutron-dhcp-openstack-openstack-cell1-cpxph\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:52 crc kubenswrapper[4766]: I1209 05:39:52.647015 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:39:53 crc kubenswrapper[4766]: I1209 05:39:53.238101 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-cpxph"] Dec 09 05:39:54 crc kubenswrapper[4766]: I1209 05:39:54.252064 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" event={"ID":"c04eb3ac-c0d3-470a-a741-3e8369270d60","Type":"ContainerStarted","Data":"668ebea138bb2cad1eef676052b6c593809d0f1fecc6a2c209a16dca5fa89899"} Dec 09 05:39:54 crc kubenswrapper[4766]: I1209 05:39:54.252409 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" event={"ID":"c04eb3ac-c0d3-470a-a741-3e8369270d60","Type":"ContainerStarted","Data":"8722efb2ebda0ffd55b1367a948134e6dc88b5fda05eb16761eddf0ec6770f23"} Dec 09 05:39:54 crc kubenswrapper[4766]: I1209 05:39:54.276454 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" podStartSLOduration=2.083650318 podStartE2EDuration="2.276429636s" podCreationTimestamp="2025-12-09 05:39:52 +0000 UTC" firstStartedPulling="2025-12-09 05:39:53.561846053 +0000 UTC m=+8875.271151499" lastFinishedPulling="2025-12-09 05:39:53.754625401 +0000 UTC m=+8875.463930817" observedRunningTime="2025-12-09 05:39:54.270087905 +0000 UTC m=+8875.979393351" watchObservedRunningTime="2025-12-09 05:39:54.276429636 +0000 UTC m=+8875.985735062" Dec 09 05:40:37 crc kubenswrapper[4766]: I1209 05:40:37.316678 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:40:37 crc kubenswrapper[4766]: I1209 05:40:37.317285 4766 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:41:07 crc kubenswrapper[4766]: I1209 05:41:07.317368 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:41:07 crc kubenswrapper[4766]: I1209 05:41:07.318043 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:41:37 crc kubenswrapper[4766]: I1209 05:41:37.316396 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:41:37 crc kubenswrapper[4766]: I1209 05:41:37.316955 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:41:37 crc kubenswrapper[4766]: I1209 05:41:37.317000 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:41:37 crc kubenswrapper[4766]: I1209 05:41:37.317871 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3173c583d13d2e08ce38c6221db09bafc0bee8b88892615645184ce7426e6265"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:41:37 crc kubenswrapper[4766]: I1209 05:41:37.317929 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://3173c583d13d2e08ce38c6221db09bafc0bee8b88892615645184ce7426e6265" gracePeriod=600 Dec 09 05:41:38 crc kubenswrapper[4766]: I1209 05:41:38.369930 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="3173c583d13d2e08ce38c6221db09bafc0bee8b88892615645184ce7426e6265" exitCode=0 Dec 09 05:41:38 crc kubenswrapper[4766]: I1209 05:41:38.370009 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"3173c583d13d2e08ce38c6221db09bafc0bee8b88892615645184ce7426e6265"} Dec 09 05:41:38 crc kubenswrapper[4766]: I1209 05:41:38.371868 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a"} Dec 09 05:41:38 crc kubenswrapper[4766]: I1209 05:41:38.371964 4766 scope.go:117] "RemoveContainer" 
containerID="5bce8326717358496a27af5496c03ca50c96a5a7ba1a1bec6cc095df2889a89f" Dec 09 05:43:20 crc kubenswrapper[4766]: I1209 05:43:20.517889 4766 generic.go:334] "Generic (PLEG): container finished" podID="c04eb3ac-c0d3-470a-a741-3e8369270d60" containerID="668ebea138bb2cad1eef676052b6c593809d0f1fecc6a2c209a16dca5fa89899" exitCode=0 Dec 09 05:43:20 crc kubenswrapper[4766]: I1209 05:43:20.517981 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" event={"ID":"c04eb3ac-c0d3-470a-a741-3e8369270d60","Type":"ContainerDied","Data":"668ebea138bb2cad1eef676052b6c593809d0f1fecc6a2c209a16dca5fa89899"} Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.001385 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.144108 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-agent-neutron-config-0\") pod \"c04eb3ac-c0d3-470a-a741-3e8369270d60\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.144169 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ceph\") pod \"c04eb3ac-c0d3-470a-a741-3e8369270d60\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.144229 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j274t\" (UniqueName: \"kubernetes.io/projected/c04eb3ac-c0d3-470a-a741-3e8369270d60-kube-api-access-j274t\") pod \"c04eb3ac-c0d3-470a-a741-3e8369270d60\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " Dec 09 05:43:22 crc 
kubenswrapper[4766]: I1209 05:43:22.144296 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-inventory\") pod \"c04eb3ac-c0d3-470a-a741-3e8369270d60\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.144411 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-combined-ca-bundle\") pod \"c04eb3ac-c0d3-470a-a741-3e8369270d60\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.144490 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ssh-key\") pod \"c04eb3ac-c0d3-470a-a741-3e8369270d60\" (UID: \"c04eb3ac-c0d3-470a-a741-3e8369270d60\") " Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.152067 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04eb3ac-c0d3-470a-a741-3e8369270d60-kube-api-access-j274t" (OuterVolumeSpecName: "kube-api-access-j274t") pod "c04eb3ac-c0d3-470a-a741-3e8369270d60" (UID: "c04eb3ac-c0d3-470a-a741-3e8369270d60"). InnerVolumeSpecName "kube-api-access-j274t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.152454 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ceph" (OuterVolumeSpecName: "ceph") pod "c04eb3ac-c0d3-470a-a741-3e8369270d60" (UID: "c04eb3ac-c0d3-470a-a741-3e8369270d60"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.153425 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "c04eb3ac-c0d3-470a-a741-3e8369270d60" (UID: "c04eb3ac-c0d3-470a-a741-3e8369270d60"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.175820 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "c04eb3ac-c0d3-470a-a741-3e8369270d60" (UID: "c04eb3ac-c0d3-470a-a741-3e8369270d60"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.175937 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-inventory" (OuterVolumeSpecName: "inventory") pod "c04eb3ac-c0d3-470a-a741-3e8369270d60" (UID: "c04eb3ac-c0d3-470a-a741-3e8369270d60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.180479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c04eb3ac-c0d3-470a-a741-3e8369270d60" (UID: "c04eb3ac-c0d3-470a-a741-3e8369270d60"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.248299 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.248335 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.248351 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j274t\" (UniqueName: \"kubernetes.io/projected/c04eb3ac-c0d3-470a-a741-3e8369270d60-kube-api-access-j274t\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.248378 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.248387 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.248396 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c04eb3ac-c0d3-470a-a741-3e8369270d60-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.539357 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" 
event={"ID":"c04eb3ac-c0d3-470a-a741-3e8369270d60","Type":"ContainerDied","Data":"8722efb2ebda0ffd55b1367a948134e6dc88b5fda05eb16761eddf0ec6770f23"} Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.539404 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8722efb2ebda0ffd55b1367a948134e6dc88b5fda05eb16761eddf0ec6770f23" Dec 09 05:43:22 crc kubenswrapper[4766]: I1209 05:43:22.539423 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-cpxph" Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.059056 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.059777 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="0fd44249-b949-4f58-9973-01c7d3494dcc" containerName="nova-cell0-conductor-conductor" containerID="cri-o://97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c" gracePeriod=30 Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.530646 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.531109 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="67b8b4b8-be77-4988-b2a6-5f2c5e21d874" containerName="nova-cell1-conductor-conductor" containerID="cri-o://46bac6e0959b2e7c1cb812308bc7f3c437527d4b8e8da8360281f1fa3c9e6784" gracePeriod=30 Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.722588 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.729126 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="c33cb8e9-0109-4f61-9441-c3a99463eaf4" containerName="nova-scheduler-scheduler" containerID="cri-o://7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf" gracePeriod=30 Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.731677 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.732024 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-log" containerID="cri-o://81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9" gracePeriod=30 Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.732241 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-api" containerID="cri-o://3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554" gracePeriod=30 Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.745125 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.745397 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-log" containerID="cri-o://26b07fb1176840683bbf295eaff4294fda9b24682a4f02345af89cf54f1d95b8" gracePeriod=30 Dec 09 05:43:29 crc kubenswrapper[4766]: I1209 05:43:29.746161 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-metadata" containerID="cri-o://4cb91fd0afc5b488372a9a94bb30813d7d6976c95137e2089670348d94b43133" gracePeriod=30 Dec 09 05:43:29 crc kubenswrapper[4766]: E1209 05:43:29.852947 4766 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 05:43:29 crc kubenswrapper[4766]: E1209 05:43:29.854422 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 05:43:29 crc kubenswrapper[4766]: E1209 05:43:29.855513 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 09 05:43:29 crc kubenswrapper[4766]: E1209 05:43:29.855579 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="0fd44249-b949-4f58-9973-01c7d3494dcc" containerName="nova-cell0-conductor-conductor" Dec 09 05:43:30 crc kubenswrapper[4766]: I1209 05:43:30.629898 4766 generic.go:334] "Generic (PLEG): container finished" podID="67b8b4b8-be77-4988-b2a6-5f2c5e21d874" containerID="46bac6e0959b2e7c1cb812308bc7f3c437527d4b8e8da8360281f1fa3c9e6784" exitCode=0 Dec 09 05:43:30 crc kubenswrapper[4766]: I1209 05:43:30.630284 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"67b8b4b8-be77-4988-b2a6-5f2c5e21d874","Type":"ContainerDied","Data":"46bac6e0959b2e7c1cb812308bc7f3c437527d4b8e8da8360281f1fa3c9e6784"} Dec 09 05:43:30 crc kubenswrapper[4766]: I1209 05:43:30.632943 4766 generic.go:334] "Generic (PLEG): container finished" podID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerID="81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9" exitCode=143 Dec 09 05:43:30 crc kubenswrapper[4766]: I1209 05:43:30.633006 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30ee8f60-34a9-4f4f-8deb-324d8bfd2405","Type":"ContainerDied","Data":"81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9"} Dec 09 05:43:30 crc kubenswrapper[4766]: I1209 05:43:30.635142 4766 generic.go:334] "Generic (PLEG): container finished" podID="63d5b763-8225-4cac-bdde-0002c06ed154" containerID="26b07fb1176840683bbf295eaff4294fda9b24682a4f02345af89cf54f1d95b8" exitCode=143 Dec 09 05:43:30 crc kubenswrapper[4766]: I1209 05:43:30.635173 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63d5b763-8225-4cac-bdde-0002c06ed154","Type":"ContainerDied","Data":"26b07fb1176840683bbf295eaff4294fda9b24682a4f02345af89cf54f1d95b8"} Dec 09 05:43:30 crc kubenswrapper[4766]: I1209 05:43:30.943020 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.060833 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-combined-ca-bundle\") pod \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.060920 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-config-data\") pod \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.060996 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4dsq\" (UniqueName: \"kubernetes.io/projected/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-kube-api-access-s4dsq\") pod \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\" (UID: \"67b8b4b8-be77-4988-b2a6-5f2c5e21d874\") " Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.103456 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-kube-api-access-s4dsq" (OuterVolumeSpecName: "kube-api-access-s4dsq") pod "67b8b4b8-be77-4988-b2a6-5f2c5e21d874" (UID: "67b8b4b8-be77-4988-b2a6-5f2c5e21d874"). InnerVolumeSpecName "kube-api-access-s4dsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.137795 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-config-data" (OuterVolumeSpecName: "config-data") pod "67b8b4b8-be77-4988-b2a6-5f2c5e21d874" (UID: "67b8b4b8-be77-4988-b2a6-5f2c5e21d874"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.164669 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.164710 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4dsq\" (UniqueName: \"kubernetes.io/projected/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-kube-api-access-s4dsq\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.199491 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67b8b4b8-be77-4988-b2a6-5f2c5e21d874" (UID: "67b8b4b8-be77-4988-b2a6-5f2c5e21d874"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.266196 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b8b4b8-be77-4988-b2a6-5f2c5e21d874-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.510254 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.644836 4766 generic.go:334] "Generic (PLEG): container finished" podID="c33cb8e9-0109-4f61-9441-c3a99463eaf4" containerID="7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf" exitCode=0 Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.644884 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.644902 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c33cb8e9-0109-4f61-9441-c3a99463eaf4","Type":"ContainerDied","Data":"7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf"} Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.644930 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c33cb8e9-0109-4f61-9441-c3a99463eaf4","Type":"ContainerDied","Data":"72128b0d092c24b31e8ba51926bcdf716aeb680a12b7f9df66dfba98aca7553f"} Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.644947 4766 scope.go:117] "RemoveContainer" containerID="7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.650180 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"67b8b4b8-be77-4988-b2a6-5f2c5e21d874","Type":"ContainerDied","Data":"1b16c70e6f9d83c9f51d7b9b327567d9c70c202cbb1d4799b849279a2eb5c468"} Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.650254 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.674028 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-combined-ca-bundle\") pod \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.676530 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-config-data\") pod \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.676641 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nl4x\" (UniqueName: \"kubernetes.io/projected/c33cb8e9-0109-4f61-9441-c3a99463eaf4-kube-api-access-7nl4x\") pod \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\" (UID: \"c33cb8e9-0109-4f61-9441-c3a99463eaf4\") " Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.685115 4766 scope.go:117] "RemoveContainer" containerID="7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf" Dec 09 05:43:31 crc kubenswrapper[4766]: E1209 05:43:31.686144 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf\": container with ID starting with 7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf not found: ID does not exist" containerID="7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.686225 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf"} err="failed to get container status \"7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf\": rpc error: code = NotFound desc = could not find container \"7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf\": container with ID starting with 7688c80d80e9fc45de3aa772aa7ea775b3acfda88952aeef47816be3992dd1cf not found: ID does not exist" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.686254 4766 scope.go:117] "RemoveContainer" containerID="46bac6e0959b2e7c1cb812308bc7f3c437527d4b8e8da8360281f1fa3c9e6784" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.696750 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33cb8e9-0109-4f61-9441-c3a99463eaf4-kube-api-access-7nl4x" (OuterVolumeSpecName: "kube-api-access-7nl4x") pod "c33cb8e9-0109-4f61-9441-c3a99463eaf4" (UID: "c33cb8e9-0109-4f61-9441-c3a99463eaf4"). InnerVolumeSpecName "kube-api-access-7nl4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.702983 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.722256 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.736176 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c33cb8e9-0109-4f61-9441-c3a99463eaf4" (UID: "c33cb8e9-0109-4f61-9441-c3a99463eaf4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.736767 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.736874 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-config-data" (OuterVolumeSpecName: "config-data") pod "c33cb8e9-0109-4f61-9441-c3a99463eaf4" (UID: "c33cb8e9-0109-4f61-9441-c3a99463eaf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:31 crc kubenswrapper[4766]: E1209 05:43:31.737273 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04eb3ac-c0d3-470a-a741-3e8369270d60" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.737290 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04eb3ac-c0d3-470a-a741-3e8369270d60" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 09 05:43:31 crc kubenswrapper[4766]: E1209 05:43:31.737311 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b8b4b8-be77-4988-b2a6-5f2c5e21d874" containerName="nova-cell1-conductor-conductor" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.737317 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b8b4b8-be77-4988-b2a6-5f2c5e21d874" containerName="nova-cell1-conductor-conductor" Dec 09 05:43:31 crc kubenswrapper[4766]: E1209 05:43:31.737344 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33cb8e9-0109-4f61-9441-c3a99463eaf4" containerName="nova-scheduler-scheduler" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.737352 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33cb8e9-0109-4f61-9441-c3a99463eaf4" containerName="nova-scheduler-scheduler" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.737532 4766 
memory_manager.go:354] "RemoveStaleState removing state" podUID="67b8b4b8-be77-4988-b2a6-5f2c5e21d874" containerName="nova-cell1-conductor-conductor" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.737561 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33cb8e9-0109-4f61-9441-c3a99463eaf4" containerName="nova-scheduler-scheduler" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.737581 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04eb3ac-c0d3-470a-a741-3e8369270d60" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.738319 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.740798 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.753039 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.778992 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714ab1b8-4018-410b-893f-a4acee38b1ca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.779225 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/714ab1b8-4018-410b-893f-a4acee38b1ca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.779287 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99hkr\" (UniqueName: \"kubernetes.io/projected/714ab1b8-4018-410b-893f-a4acee38b1ca-kube-api-access-99hkr\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.779403 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.779424 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nl4x\" (UniqueName: \"kubernetes.io/projected/c33cb8e9-0109-4f61-9441-c3a99463eaf4-kube-api-access-7nl4x\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.779438 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c33cb8e9-0109-4f61-9441-c3a99463eaf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.881103 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/714ab1b8-4018-410b-893f-a4acee38b1ca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.882003 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99hkr\" (UniqueName: \"kubernetes.io/projected/714ab1b8-4018-410b-893f-a4acee38b1ca-kube-api-access-99hkr\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.882171 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714ab1b8-4018-410b-893f-a4acee38b1ca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.885282 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714ab1b8-4018-410b-893f-a4acee38b1ca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.885387 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/714ab1b8-4018-410b-893f-a4acee38b1ca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.898897 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99hkr\" (UniqueName: \"kubernetes.io/projected/714ab1b8-4018-410b-893f-a4acee38b1ca-kube-api-access-99hkr\") pod \"nova-cell1-conductor-0\" (UID: \"714ab1b8-4018-410b-893f-a4acee38b1ca\") " pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.980533 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 05:43:31 crc kubenswrapper[4766]: I1209 05:43:31.992810 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.007545 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.009396 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.011296 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.025438 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.063558 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.086183 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z96h5\" (UniqueName: \"kubernetes.io/projected/e73e246e-ca55-4721-8836-da091b7cb32d-kube-api-access-z96h5\") pod \"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.086928 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73e246e-ca55-4721-8836-da091b7cb32d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.086984 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73e246e-ca55-4721-8836-da091b7cb32d-config-data\") pod \"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.188685 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73e246e-ca55-4721-8836-da091b7cb32d-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.188724 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73e246e-ca55-4721-8836-da091b7cb32d-config-data\") pod \"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.188798 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z96h5\" (UniqueName: \"kubernetes.io/projected/e73e246e-ca55-4721-8836-da091b7cb32d-kube-api-access-z96h5\") pod \"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.193837 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73e246e-ca55-4721-8836-da091b7cb32d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.194432 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73e246e-ca55-4721-8836-da091b7cb32d-config-data\") pod \"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.205249 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z96h5\" (UniqueName: \"kubernetes.io/projected/e73e246e-ca55-4721-8836-da091b7cb32d-kube-api-access-z96h5\") pod \"nova-scheduler-0\" (UID: \"e73e246e-ca55-4721-8836-da091b7cb32d\") " pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.327770 4766 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.523655 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.666840 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"714ab1b8-4018-410b-893f-a4acee38b1ca","Type":"ContainerStarted","Data":"bfc5e666b41eaee005c104c3ee9f253273757118ab8bb50d666671daa1701da2"} Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.827747 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 09 05:43:32 crc kubenswrapper[4766]: W1209 05:43:32.828203 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode73e246e_ca55_4721_8836_da091b7cb32d.slice/crio-be3e61d60a3f8b74b1d060ec029ada02873e17124ca5eb91a470c69fd54183f2 WatchSource:0}: Error finding container be3e61d60a3f8b74b1d060ec029ada02873e17124ca5eb91a470c69fd54183f2: Status 404 returned error can't find the container with id be3e61d60a3f8b74b1d060ec029ada02873e17124ca5eb91a470c69fd54183f2 Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.850979 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b8b4b8-be77-4988-b2a6-5f2c5e21d874" path="/var/lib/kubelet/pods/67b8b4b8-be77-4988-b2a6-5f2c5e21d874/volumes" Dec 09 05:43:32 crc kubenswrapper[4766]: I1209 05:43:32.852013 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33cb8e9-0109-4f61-9441-c3a99463eaf4" path="/var/lib/kubelet/pods/c33cb8e9-0109-4f61-9441-c3a99463eaf4/volumes" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.617772 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.633167 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-config-data\") pod \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.633328 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-logs\") pod \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.633450 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7nnb\" (UniqueName: \"kubernetes.io/projected/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-kube-api-access-j7nnb\") pod \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.633483 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-combined-ca-bundle\") pod \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\" (UID: \"30ee8f60-34a9-4f4f-8deb-324d8bfd2405\") " Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.635184 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-logs" (OuterVolumeSpecName: "logs") pod "30ee8f60-34a9-4f4f-8deb-324d8bfd2405" (UID: "30ee8f60-34a9-4f4f-8deb-324d8bfd2405"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.644401 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-kube-api-access-j7nnb" (OuterVolumeSpecName: "kube-api-access-j7nnb") pod "30ee8f60-34a9-4f4f-8deb-324d8bfd2405" (UID: "30ee8f60-34a9-4f4f-8deb-324d8bfd2405"). InnerVolumeSpecName "kube-api-access-j7nnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.692758 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"714ab1b8-4018-410b-893f-a4acee38b1ca","Type":"ContainerStarted","Data":"f2cbf32a23bc81d97dfad3c5f7c6951be2b08158f492f9846ece49147c063848"} Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.694007 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.696367 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e73e246e-ca55-4721-8836-da091b7cb32d","Type":"ContainerStarted","Data":"ba0c26a16ba7d258be000bf815825ad2344e264834f4c7c1b368b1332bfee267"} Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.696399 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e73e246e-ca55-4721-8836-da091b7cb32d","Type":"ContainerStarted","Data":"be3e61d60a3f8b74b1d060ec029ada02873e17124ca5eb91a470c69fd54183f2"} Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.704043 4766 generic.go:334] "Generic (PLEG): container finished" podID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerID="3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554" exitCode=0 Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.704117 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"30ee8f60-34a9-4f4f-8deb-324d8bfd2405","Type":"ContainerDied","Data":"3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554"} Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.704146 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30ee8f60-34a9-4f4f-8deb-324d8bfd2405","Type":"ContainerDied","Data":"ca200d6b5334c4ccaa70734e29c9da3433ababf42cd99e6eafcc49b771a581d6"} Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.704167 4766 scope.go:117] "RemoveContainer" containerID="3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.704331 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.709998 4766 generic.go:334] "Generic (PLEG): container finished" podID="63d5b763-8225-4cac-bdde-0002c06ed154" containerID="4cb91fd0afc5b488372a9a94bb30813d7d6976c95137e2089670348d94b43133" exitCode=0 Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.710042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63d5b763-8225-4cac-bdde-0002c06ed154","Type":"ContainerDied","Data":"4cb91fd0afc5b488372a9a94bb30813d7d6976c95137e2089670348d94b43133"} Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.734874 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.734856531 podStartE2EDuration="2.734856531s" podCreationTimestamp="2025-12-09 05:43:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:43:33.715413235 +0000 UTC m=+9095.424718661" watchObservedRunningTime="2025-12-09 05:43:33.734856531 +0000 UTC m=+9095.444161957" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 
05:43:33.735471 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-logs\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.735498 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7nnb\" (UniqueName: \"kubernetes.io/projected/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-kube-api-access-j7nnb\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:33 crc kubenswrapper[4766]: I1209 05:43:33.742438 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.742417406 podStartE2EDuration="2.742417406s" podCreationTimestamp="2025-12-09 05:43:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:43:33.733506675 +0000 UTC m=+9095.442812111" watchObservedRunningTime="2025-12-09 05:43:33.742417406 +0000 UTC m=+9095.451722832" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.286962 4766 scope.go:117] "RemoveContainer" containerID="81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.299627 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-config-data" (OuterVolumeSpecName: "config-data") pod "30ee8f60-34a9-4f4f-8deb-324d8bfd2405" (UID: "30ee8f60-34a9-4f4f-8deb-324d8bfd2405"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.332369 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30ee8f60-34a9-4f4f-8deb-324d8bfd2405" (UID: "30ee8f60-34a9-4f4f-8deb-324d8bfd2405"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.348914 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.348956 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee8f60-34a9-4f4f-8deb-324d8bfd2405-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.427830 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.439048 4766 scope.go:117] "RemoveContainer" containerID="3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554" Dec 09 05:43:34 crc kubenswrapper[4766]: E1209 05:43:34.439485 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554\": container with ID starting with 3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554 not found: ID does not exist" containerID="3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.439525 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554"} err="failed to get container status \"3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554\": rpc error: code = NotFound desc = could not find container \"3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554\": container with ID starting with 
3491c81b1be52362d6c5797cd98cd44525999e711d1047564b60cbe0ca8b1554 not found: ID does not exist" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.439549 4766 scope.go:117] "RemoveContainer" containerID="81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9" Dec 09 05:43:34 crc kubenswrapper[4766]: E1209 05:43:34.439764 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9\": container with ID starting with 81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9 not found: ID does not exist" containerID="81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.439784 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9"} err="failed to get container status \"81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9\": rpc error: code = NotFound desc = could not find container \"81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9\": container with ID starting with 81f8c3d91dd1d02560ac93d4b5bd067eabf8053a132dd722f23d77a0757ab7f9 not found: ID does not exist" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.463064 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d5b763-8225-4cac-bdde-0002c06ed154-logs\") pod \"63d5b763-8225-4cac-bdde-0002c06ed154\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.463133 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-config-data\") pod \"63d5b763-8225-4cac-bdde-0002c06ed154\" (UID: 
\"63d5b763-8225-4cac-bdde-0002c06ed154\") " Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.463172 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-combined-ca-bundle\") pod \"63d5b763-8225-4cac-bdde-0002c06ed154\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.463199 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stnmn\" (UniqueName: \"kubernetes.io/projected/63d5b763-8225-4cac-bdde-0002c06ed154-kube-api-access-stnmn\") pod \"63d5b763-8225-4cac-bdde-0002c06ed154\" (UID: \"63d5b763-8225-4cac-bdde-0002c06ed154\") " Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.474751 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d5b763-8225-4cac-bdde-0002c06ed154-kube-api-access-stnmn" (OuterVolumeSpecName: "kube-api-access-stnmn") pod "63d5b763-8225-4cac-bdde-0002c06ed154" (UID: "63d5b763-8225-4cac-bdde-0002c06ed154"). InnerVolumeSpecName "kube-api-access-stnmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.479637 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d5b763-8225-4cac-bdde-0002c06ed154-logs" (OuterVolumeSpecName: "logs") pod "63d5b763-8225-4cac-bdde-0002c06ed154" (UID: "63d5b763-8225-4cac-bdde-0002c06ed154"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.519338 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-config-data" (OuterVolumeSpecName: "config-data") pod "63d5b763-8225-4cac-bdde-0002c06ed154" (UID: "63d5b763-8225-4cac-bdde-0002c06ed154"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.545550 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63d5b763-8225-4cac-bdde-0002c06ed154" (UID: "63d5b763-8225-4cac-bdde-0002c06ed154"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.564504 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stnmn\" (UniqueName: \"kubernetes.io/projected/63d5b763-8225-4cac-bdde-0002c06ed154-kube-api-access-stnmn\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.564542 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d5b763-8225-4cac-bdde-0002c06ed154-logs\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.564551 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.564562 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d5b763-8225-4cac-bdde-0002c06ed154-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.644388 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.663903 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.666613 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-combined-ca-bundle\") pod \"0fd44249-b949-4f58-9973-01c7d3494dcc\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.666770 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbgjq\" (UniqueName: \"kubernetes.io/projected/0fd44249-b949-4f58-9973-01c7d3494dcc-kube-api-access-jbgjq\") pod \"0fd44249-b949-4f58-9973-01c7d3494dcc\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.666845 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-config-data\") pod \"0fd44249-b949-4f58-9973-01c7d3494dcc\" (UID: \"0fd44249-b949-4f58-9973-01c7d3494dcc\") " Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.671688 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.681024 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd44249-b949-4f58-9973-01c7d3494dcc-kube-api-access-jbgjq" (OuterVolumeSpecName: "kube-api-access-jbgjq") pod "0fd44249-b949-4f58-9973-01c7d3494dcc" (UID: "0fd44249-b949-4f58-9973-01c7d3494dcc"). InnerVolumeSpecName "kube-api-access-jbgjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.694278 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 09 05:43:34 crc kubenswrapper[4766]: E1209 05:43:34.694840 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-log" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.694857 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-log" Dec 09 05:43:34 crc kubenswrapper[4766]: E1209 05:43:34.694874 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd44249-b949-4f58-9973-01c7d3494dcc" containerName="nova-cell0-conductor-conductor" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.694880 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd44249-b949-4f58-9973-01c7d3494dcc" containerName="nova-cell0-conductor-conductor" Dec 09 05:43:34 crc kubenswrapper[4766]: E1209 05:43:34.694896 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-metadata" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.694903 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-metadata" Dec 09 05:43:34 crc kubenswrapper[4766]: E1209 05:43:34.694913 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-api" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.694921 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-api" Dec 09 05:43:34 crc kubenswrapper[4766]: E1209 05:43:34.694955 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-log" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.694961 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-log" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.695165 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-log" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.695179 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-metadata" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.695202 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd44249-b949-4f58-9973-01c7d3494dcc" containerName="nova-cell0-conductor-conductor" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.695231 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-log" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.695241 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" containerName="nova-api-api" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.696439 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.711739 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.730431 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-config-data" (OuterVolumeSpecName: "config-data") pod "0fd44249-b949-4f58-9973-01c7d3494dcc" (UID: "0fd44249-b949-4f58-9973-01c7d3494dcc"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.738452 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.738617 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63d5b763-8225-4cac-bdde-0002c06ed154","Type":"ContainerDied","Data":"f8dc682b4a3d2cbe26dd7963a2967f00d3d665120fd21a4ca7306d3f594f1a22"} Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.738666 4766 scope.go:117] "RemoveContainer" containerID="4cb91fd0afc5b488372a9a94bb30813d7d6976c95137e2089670348d94b43133" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.743898 4766 generic.go:334] "Generic (PLEG): container finished" podID="0fd44249-b949-4f58-9973-01c7d3494dcc" containerID="97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c" exitCode=0 Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.744017 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.744017 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0fd44249-b949-4f58-9973-01c7d3494dcc","Type":"ContainerDied","Data":"97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c"} Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.744663 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0fd44249-b949-4f58-9973-01c7d3494dcc","Type":"ContainerDied","Data":"40db9e28de0ca25e99fc5720b0f379e953b4401069ecd31420b52e2cfa7e5af1"} Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.769451 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.771500 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxgsd\" (UniqueName: \"kubernetes.io/projected/23d93ca5-3499-4896-9d83-4641813d0844-kube-api-access-sxgsd\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.771579 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d93ca5-3499-4896-9d83-4641813d0844-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.771684 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d93ca5-3499-4896-9d83-4641813d0844-config-data\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 
05:43:34.771734 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23d93ca5-3499-4896-9d83-4641813d0844-logs\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.771812 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbgjq\" (UniqueName: \"kubernetes.io/projected/0fd44249-b949-4f58-9973-01c7d3494dcc-kube-api-access-jbgjq\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.771828 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.776417 4766 scope.go:117] "RemoveContainer" containerID="26b07fb1176840683bbf295eaff4294fda9b24682a4f02345af89cf54f1d95b8" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.776715 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fd44249-b949-4f58-9973-01c7d3494dcc" (UID: "0fd44249-b949-4f58-9973-01c7d3494dcc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.814485 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.831810 4766 scope.go:117] "RemoveContainer" containerID="97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.837352 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.856761 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ee8f60-34a9-4f4f-8deb-324d8bfd2405" path="/var/lib/kubelet/pods/30ee8f60-34a9-4f4f-8deb-324d8bfd2405/volumes" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.857408 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" path="/var/lib/kubelet/pods/63d5b763-8225-4cac-bdde-0002c06ed154/volumes" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.858608 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.860368 4766 scope.go:117] "RemoveContainer" containerID="97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.860691 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.865250 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 05:43:34 crc kubenswrapper[4766]: E1209 05:43:34.865259 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c\": container with ID starting with 97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c not found: ID does not exist" containerID="97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.865346 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c"} err="failed to get container status \"97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c\": rpc error: code = NotFound desc = could not find container \"97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c\": container with ID starting with 97a83504d3e31e9b1919bbcf98640630e26b08e9e78febcd58900b92f3cc3f5c not found: ID does not exist" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.873928 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d93ca5-3499-4896-9d83-4641813d0844-config-data\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.874018 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23d93ca5-3499-4896-9d83-4641813d0844-logs\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 
05:43:34.874190 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxgsd\" (UniqueName: \"kubernetes.io/projected/23d93ca5-3499-4896-9d83-4641813d0844-kube-api-access-sxgsd\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.874265 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37589400-9a3a-4acc-baae-2ff3d1f09e30-logs\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.874302 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37589400-9a3a-4acc-baae-2ff3d1f09e30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.874338 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d93ca5-3499-4896-9d83-4641813d0844-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.874371 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37589400-9a3a-4acc-baae-2ff3d1f09e30-config-data\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.874415 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqz6w\" 
(UniqueName: \"kubernetes.io/projected/37589400-9a3a-4acc-baae-2ff3d1f09e30-kube-api-access-cqz6w\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.874488 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd44249-b949-4f58-9973-01c7d3494dcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.874857 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23d93ca5-3499-4896-9d83-4641813d0844-logs\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.875138 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.885893 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d93ca5-3499-4896-9d83-4641813d0844-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.886955 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d93ca5-3499-4896-9d83-4641813d0844-config-data\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.895661 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxgsd\" (UniqueName: \"kubernetes.io/projected/23d93ca5-3499-4896-9d83-4641813d0844-kube-api-access-sxgsd\") pod \"nova-api-0\" (UID: \"23d93ca5-3499-4896-9d83-4641813d0844\") " 
pod="openstack/nova-api-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.976231 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37589400-9a3a-4acc-baae-2ff3d1f09e30-config-data\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.976303 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqz6w\" (UniqueName: \"kubernetes.io/projected/37589400-9a3a-4acc-baae-2ff3d1f09e30-kube-api-access-cqz6w\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.976656 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37589400-9a3a-4acc-baae-2ff3d1f09e30-logs\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.976783 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37589400-9a3a-4acc-baae-2ff3d1f09e30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.980353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37589400-9a3a-4acc-baae-2ff3d1f09e30-logs\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.980900 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37589400-9a3a-4acc-baae-2ff3d1f09e30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:34 crc kubenswrapper[4766]: I1209 05:43:34.981043 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37589400-9a3a-4acc-baae-2ff3d1f09e30-config-data\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.000480 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqz6w\" (UniqueName: \"kubernetes.io/projected/37589400-9a3a-4acc-baae-2ff3d1f09e30-kube-api-access-cqz6w\") pod \"nova-metadata-0\" (UID: \"37589400-9a3a-4acc-baae-2ff3d1f09e30\") " pod="openstack/nova-metadata-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.040949 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.080379 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.102611 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.129274 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.131087 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.133730 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.144354 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.191443 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.293978 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a70e4b72-a141-4697-83de-8c3fbf6f9c58-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.294154 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rzv8\" (UniqueName: \"kubernetes.io/projected/a70e4b72-a141-4697-83de-8c3fbf6f9c58-kube-api-access-7rzv8\") pod \"nova-cell0-conductor-0\" (UID: \"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.294184 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a70e4b72-a141-4697-83de-8c3fbf6f9c58-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.396529 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rzv8\" (UniqueName: 
\"kubernetes.io/projected/a70e4b72-a141-4697-83de-8c3fbf6f9c58-kube-api-access-7rzv8\") pod \"nova-cell0-conductor-0\" (UID: \"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.396587 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a70e4b72-a141-4697-83de-8c3fbf6f9c58-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.396699 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a70e4b72-a141-4697-83de-8c3fbf6f9c58-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.402397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a70e4b72-a141-4697-83de-8c3fbf6f9c58-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.416604 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a70e4b72-a141-4697-83de-8c3fbf6f9c58-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.429581 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rzv8\" (UniqueName: \"kubernetes.io/projected/a70e4b72-a141-4697-83de-8c3fbf6f9c58-kube-api-access-7rzv8\") pod \"nova-cell0-conductor-0\" (UID: 
\"a70e4b72-a141-4697-83de-8c3fbf6f9c58\") " pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.543711 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.631080 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 09 05:43:35 crc kubenswrapper[4766]: W1209 05:43:35.659344 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23d93ca5_3499_4896_9d83_4641813d0844.slice/crio-f53044408264117b4b1245720bbcb25d25b2acc9c97c7b2eb56fca3fd0b15e3d WatchSource:0}: Error finding container f53044408264117b4b1245720bbcb25d25b2acc9c97c7b2eb56fca3fd0b15e3d: Status 404 returned error can't find the container with id f53044408264117b4b1245720bbcb25d25b2acc9c97c7b2eb56fca3fd0b15e3d Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.773845 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23d93ca5-3499-4896-9d83-4641813d0844","Type":"ContainerStarted","Data":"f53044408264117b4b1245720bbcb25d25b2acc9c97c7b2eb56fca3fd0b15e3d"} Dec 09 05:43:35 crc kubenswrapper[4766]: I1209 05:43:35.809729 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.093027 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 09 05:43:36 crc kubenswrapper[4766]: W1209 05:43:36.265875 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37589400_9a3a_4acc_baae_2ff3d1f09e30.slice/crio-969c1a21d481cacdac4b07a9be4def14248d63c03368346cb5a28fd8cfe7b6ed WatchSource:0}: Error finding container 969c1a21d481cacdac4b07a9be4def14248d63c03368346cb5a28fd8cfe7b6ed: Status 404 returned error 
can't find the container with id 969c1a21d481cacdac4b07a9be4def14248d63c03368346cb5a28fd8cfe7b6ed Dec 09 05:43:36 crc kubenswrapper[4766]: W1209 05:43:36.279522 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda70e4b72_a141_4697_83de_8c3fbf6f9c58.slice/crio-08ce079421b1fc835a0b82e59d487c2c8f1af5d88afb7b88916b0888065fb478 WatchSource:0}: Error finding container 08ce079421b1fc835a0b82e59d487c2c8f1af5d88afb7b88916b0888065fb478: Status 404 returned error can't find the container with id 08ce079421b1fc835a0b82e59d487c2c8f1af5d88afb7b88916b0888065fb478 Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.786013 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23d93ca5-3499-4896-9d83-4641813d0844","Type":"ContainerStarted","Data":"dd4b341e4107b8be604341aea11b861847322ae74e795a15040f08847e922541"} Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.786371 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23d93ca5-3499-4896-9d83-4641813d0844","Type":"ContainerStarted","Data":"9d1372ca23d7d6c0e8466d3191d98e949f50a5f6902b09e1f677999e47311dfc"} Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.791747 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37589400-9a3a-4acc-baae-2ff3d1f09e30","Type":"ContainerStarted","Data":"541bd4725eb715d10fa18431833e57b9c9592999a7339c09168fa1e522b7fc8a"} Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.791802 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37589400-9a3a-4acc-baae-2ff3d1f09e30","Type":"ContainerStarted","Data":"969c1a21d481cacdac4b07a9be4def14248d63c03368346cb5a28fd8cfe7b6ed"} Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.794423 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"a70e4b72-a141-4697-83de-8c3fbf6f9c58","Type":"ContainerStarted","Data":"79364f867e7b38fd6367e48964ac58f89ff731eb2786d95df77b3e7ce4f15476"} Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.794474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a70e4b72-a141-4697-83de-8c3fbf6f9c58","Type":"ContainerStarted","Data":"08ce079421b1fc835a0b82e59d487c2c8f1af5d88afb7b88916b0888065fb478"} Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.794602 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.805563 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.805541762 podStartE2EDuration="2.805541762s" podCreationTimestamp="2025-12-09 05:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:43:36.801863712 +0000 UTC m=+9098.511169138" watchObservedRunningTime="2025-12-09 05:43:36.805541762 +0000 UTC m=+9098.514847188" Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.834735 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.834716102 podStartE2EDuration="1.834716102s" podCreationTimestamp="2025-12-09 05:43:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:43:36.821109203 +0000 UTC m=+9098.530414629" watchObservedRunningTime="2025-12-09 05:43:36.834716102 +0000 UTC m=+9098.544021528" Dec 09 05:43:36 crc kubenswrapper[4766]: I1209 05:43:36.857164 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd44249-b949-4f58-9973-01c7d3494dcc" path="/var/lib/kubelet/pods/0fd44249-b949-4f58-9973-01c7d3494dcc/volumes" 
Dec 09 05:43:37 crc kubenswrapper[4766]: I1209 05:43:37.093191 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 09 05:43:37 crc kubenswrapper[4766]: I1209 05:43:37.327968 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 09 05:43:39 crc kubenswrapper[4766]: I1209 05:43:39.193424 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": dial tcp 10.217.1.81:8775: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 09 05:43:39 crc kubenswrapper[4766]: I1209 05:43:39.193899 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="63d5b763-8225-4cac-bdde-0002c06ed154" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": dial tcp 10.217.1.81:8775: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 09 05:43:39 crc kubenswrapper[4766]: I1209 05:43:39.316905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37589400-9a3a-4acc-baae-2ff3d1f09e30","Type":"ContainerStarted","Data":"570bab7c4197c3ef847c05f0ebacf8f21391201b22f757862be18e90b141be12"} Dec 09 05:43:39 crc kubenswrapper[4766]: I1209 05:43:39.416810 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=5.416769415 podStartE2EDuration="5.416769415s" podCreationTimestamp="2025-12-09 05:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:43:39.353799541 +0000 UTC m=+9101.063104967" watchObservedRunningTime="2025-12-09 05:43:39.416769415 +0000 UTC m=+9101.126074841" Dec 09 05:43:40 crc 
kubenswrapper[4766]: I1209 05:43:40.192428 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 05:43:40 crc kubenswrapper[4766]: I1209 05:43:40.192881 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 09 05:43:42 crc kubenswrapper[4766]: I1209 05:43:42.328319 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 09 05:43:42 crc kubenswrapper[4766]: I1209 05:43:42.358284 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 09 05:43:42 crc kubenswrapper[4766]: I1209 05:43:42.389469 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 09 05:43:45 crc kubenswrapper[4766]: I1209 05:43:45.042050 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 05:43:45 crc kubenswrapper[4766]: I1209 05:43:45.042400 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 09 05:43:45 crc kubenswrapper[4766]: I1209 05:43:45.192394 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 05:43:45 crc kubenswrapper[4766]: I1209 05:43:45.192464 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 09 05:43:45 crc kubenswrapper[4766]: I1209 05:43:45.577532 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 09 05:43:46 crc kubenswrapper[4766]: I1209 05:43:46.126462 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="23d93ca5-3499-4896-9d83-4641813d0844" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.188:8774/\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Dec 09 05:43:46 crc kubenswrapper[4766]: I1209 05:43:46.126473 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="23d93ca5-3499-4896-9d83-4641813d0844" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 05:43:46 crc kubenswrapper[4766]: I1209 05:43:46.276390 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="37589400-9a3a-4acc-baae-2ff3d1f09e30" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.189:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 05:43:46 crc kubenswrapper[4766]: I1209 05:43:46.276928 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="37589400-9a3a-4acc-baae-2ff3d1f09e30" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.189:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.045719 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.048249 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.053378 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.070319 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.195410 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 05:43:55 crc 
kubenswrapper[4766]: I1209 05:43:55.195516 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.197776 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.200028 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.493861 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 09 05:43:55 crc kubenswrapper[4766]: I1209 05:43:55.503766 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.695820 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v"] Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.698464 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.702525 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.702553 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.702530 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.702687 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.702758 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.702689 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-d8s54" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.703335 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.707988 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v"] Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796166 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796231 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796258 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796460 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796620 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 
09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796692 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvqm4\" (UniqueName: \"kubernetes.io/projected/8683912f-3cfa-4505-abfc-49943b3965c7-kube-api-access-hvqm4\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796810 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796840 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.796883 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 
05:43:56.796921 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.797025 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.899707 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.899801 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.899842 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-hvqm4\" (UniqueName: \"kubernetes.io/projected/8683912f-3cfa-4505-abfc-49943b3965c7-kube-api-access-hvqm4\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.899914 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.899950 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.900007 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.900047 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.900079 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.900157 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.900191 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.900239 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-0\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.900939 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.903619 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.907808 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.907851 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.907860 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.908118 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.911526 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.911633 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.912959 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.914669 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:56 crc kubenswrapper[4766]: I1209 05:43:56.921722 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvqm4\" (UniqueName: \"kubernetes.io/projected/8683912f-3cfa-4505-abfc-49943b3965c7-kube-api-access-hvqm4\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:57 crc kubenswrapper[4766]: I1209 05:43:57.048073 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:43:57 crc kubenswrapper[4766]: I1209 05:43:57.718650 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:43:57 crc kubenswrapper[4766]: I1209 05:43:57.719728 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v"] Dec 09 05:43:58 crc kubenswrapper[4766]: I1209 05:43:58.536811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" event={"ID":"8683912f-3cfa-4505-abfc-49943b3965c7","Type":"ContainerStarted","Data":"42a2c9e4e2128e130a953b5a2c2d13b7248ab006107899c72f5fe1dca94ca19e"} Dec 09 05:43:59 crc kubenswrapper[4766]: I1209 05:43:59.547708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" event={"ID":"8683912f-3cfa-4505-abfc-49943b3965c7","Type":"ContainerStarted","Data":"ed2de21fd2b19470325e8545381e16cad275cc4dd73d64068e0b52f4e74c23fc"} Dec 09 05:43:59 crc kubenswrapper[4766]: I1209 05:43:59.583553 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" podStartSLOduration=3.405944484 podStartE2EDuration="3.583529842s" podCreationTimestamp="2025-12-09 05:43:56 +0000 UTC" firstStartedPulling="2025-12-09 05:43:57.718414335 +0000 UTC m=+9119.427719761" lastFinishedPulling="2025-12-09 05:43:57.895999693 +0000 UTC m=+9119.605305119" observedRunningTime="2025-12-09 05:43:59.56646138 +0000 UTC m=+9121.275766816" watchObservedRunningTime="2025-12-09 05:43:59.583529842 +0000 UTC m=+9121.292835278" Dec 09 05:44:07 crc kubenswrapper[4766]: I1209 05:44:07.316863 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:44:07 crc kubenswrapper[4766]: I1209 05:44:07.317499 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:44:37 crc kubenswrapper[4766]: I1209 05:44:37.316800 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:44:37 crc kubenswrapper[4766]: I1209 05:44:37.317244 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.156795 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qhpj8"] Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.160421 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.173493 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhpj8"] Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.184486 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-catalog-content\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.184588 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp6l6\" (UniqueName: \"kubernetes.io/projected/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-kube-api-access-dp6l6\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.184669 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-utilities\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.285636 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp6l6\" (UniqueName: \"kubernetes.io/projected/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-kube-api-access-dp6l6\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.285742 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-utilities\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.285825 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-catalog-content\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.286392 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-catalog-content\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.286392 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-utilities\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.305758 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp6l6\" (UniqueName: \"kubernetes.io/projected/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-kube-api-access-dp6l6\") pod \"community-operators-qhpj8\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:43 crc kubenswrapper[4766]: I1209 05:44:43.517873 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:44 crc kubenswrapper[4766]: I1209 05:44:44.038991 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhpj8"] Dec 09 05:44:44 crc kubenswrapper[4766]: I1209 05:44:44.120540 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhpj8" event={"ID":"486d7e63-e9a9-4ed7-ac44-77276d37fd2f","Type":"ContainerStarted","Data":"21dedb14990431b1a7f2b4d83261c2446be1206b48a686ce852e2c43eef8a87d"} Dec 09 05:44:45 crc kubenswrapper[4766]: I1209 05:44:45.139100 4766 generic.go:334] "Generic (PLEG): container finished" podID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerID="7e3c3b09fa1e20df9513867d5f8f5ce9c4c9d406fbe6de5a812316224b1a1e1f" exitCode=0 Dec 09 05:44:45 crc kubenswrapper[4766]: I1209 05:44:45.139576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhpj8" event={"ID":"486d7e63-e9a9-4ed7-ac44-77276d37fd2f","Type":"ContainerDied","Data":"7e3c3b09fa1e20df9513867d5f8f5ce9c4c9d406fbe6de5a812316224b1a1e1f"} Dec 09 05:44:46 crc kubenswrapper[4766]: I1209 05:44:46.150931 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhpj8" event={"ID":"486d7e63-e9a9-4ed7-ac44-77276d37fd2f","Type":"ContainerStarted","Data":"ef3e7a4198b51179020ac1aa117b21bcda0a7dd72fdacadb300d6219f91157c8"} Dec 09 05:44:47 crc kubenswrapper[4766]: I1209 05:44:47.168846 4766 generic.go:334] "Generic (PLEG): container finished" podID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerID="ef3e7a4198b51179020ac1aa117b21bcda0a7dd72fdacadb300d6219f91157c8" exitCode=0 Dec 09 05:44:47 crc kubenswrapper[4766]: I1209 05:44:47.168962 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhpj8" 
event={"ID":"486d7e63-e9a9-4ed7-ac44-77276d37fd2f","Type":"ContainerDied","Data":"ef3e7a4198b51179020ac1aa117b21bcda0a7dd72fdacadb300d6219f91157c8"} Dec 09 05:44:49 crc kubenswrapper[4766]: I1209 05:44:49.186951 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhpj8" event={"ID":"486d7e63-e9a9-4ed7-ac44-77276d37fd2f","Type":"ContainerStarted","Data":"740b7d6688d980c1b07d17d2abee08e68c3f329425dd3e7faf68afb4d2ff645b"} Dec 09 05:44:49 crc kubenswrapper[4766]: I1209 05:44:49.211982 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qhpj8" podStartSLOduration=3.774694766 podStartE2EDuration="6.211961541s" podCreationTimestamp="2025-12-09 05:44:43 +0000 UTC" firstStartedPulling="2025-12-09 05:44:45.144056926 +0000 UTC m=+9166.853362352" lastFinishedPulling="2025-12-09 05:44:47.581323701 +0000 UTC m=+9169.290629127" observedRunningTime="2025-12-09 05:44:49.200512311 +0000 UTC m=+9170.909817737" watchObservedRunningTime="2025-12-09 05:44:49.211961541 +0000 UTC m=+9170.921266967" Dec 09 05:44:53 crc kubenswrapper[4766]: I1209 05:44:53.518492 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:53 crc kubenswrapper[4766]: I1209 05:44:53.518845 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:53 crc kubenswrapper[4766]: I1209 05:44:53.593131 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:54 crc kubenswrapper[4766]: I1209 05:44:54.314316 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:54 crc kubenswrapper[4766]: I1209 05:44:54.374015 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-qhpj8"] Dec 09 05:44:56 crc kubenswrapper[4766]: I1209 05:44:56.287161 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qhpj8" podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerName="registry-server" containerID="cri-o://740b7d6688d980c1b07d17d2abee08e68c3f329425dd3e7faf68afb4d2ff645b" gracePeriod=2 Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.305096 4766 generic.go:334] "Generic (PLEG): container finished" podID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerID="740b7d6688d980c1b07d17d2abee08e68c3f329425dd3e7faf68afb4d2ff645b" exitCode=0 Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.305170 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhpj8" event={"ID":"486d7e63-e9a9-4ed7-ac44-77276d37fd2f","Type":"ContainerDied","Data":"740b7d6688d980c1b07d17d2abee08e68c3f329425dd3e7faf68afb4d2ff645b"} Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.305675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhpj8" event={"ID":"486d7e63-e9a9-4ed7-ac44-77276d37fd2f","Type":"ContainerDied","Data":"21dedb14990431b1a7f2b4d83261c2446be1206b48a686ce852e2c43eef8a87d"} Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.305710 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21dedb14990431b1a7f2b4d83261c2446be1206b48a686ce852e2c43eef8a87d" Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.337860 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.439527 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-utilities\") pod \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.439769 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp6l6\" (UniqueName: \"kubernetes.io/projected/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-kube-api-access-dp6l6\") pod \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.439815 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-catalog-content\") pod \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\" (UID: \"486d7e63-e9a9-4ed7-ac44-77276d37fd2f\") " Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.440466 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-utilities" (OuterVolumeSpecName: "utilities") pod "486d7e63-e9a9-4ed7-ac44-77276d37fd2f" (UID: "486d7e63-e9a9-4ed7-ac44-77276d37fd2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.446156 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-kube-api-access-dp6l6" (OuterVolumeSpecName: "kube-api-access-dp6l6") pod "486d7e63-e9a9-4ed7-ac44-77276d37fd2f" (UID: "486d7e63-e9a9-4ed7-ac44-77276d37fd2f"). InnerVolumeSpecName "kube-api-access-dp6l6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.490335 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "486d7e63-e9a9-4ed7-ac44-77276d37fd2f" (UID: "486d7e63-e9a9-4ed7-ac44-77276d37fd2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.542661 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp6l6\" (UniqueName: \"kubernetes.io/projected/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-kube-api-access-dp6l6\") on node \"crc\" DevicePath \"\"" Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.542710 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:44:57 crc kubenswrapper[4766]: I1209 05:44:57.542725 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/486d7e63-e9a9-4ed7-ac44-77276d37fd2f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:44:58 crc kubenswrapper[4766]: I1209 05:44:58.316993 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhpj8" Dec 09 05:44:58 crc kubenswrapper[4766]: I1209 05:44:58.361810 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qhpj8"] Dec 09 05:44:58 crc kubenswrapper[4766]: I1209 05:44:58.378210 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qhpj8"] Dec 09 05:44:58 crc kubenswrapper[4766]: I1209 05:44:58.856654 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" path="/var/lib/kubelet/pods/486d7e63-e9a9-4ed7-ac44-77276d37fd2f/volumes" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.177129 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5"] Dec 09 05:45:00 crc kubenswrapper[4766]: E1209 05:45:00.177945 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerName="registry-server" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.177960 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerName="registry-server" Dec 09 05:45:00 crc kubenswrapper[4766]: E1209 05:45:00.177990 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerName="extract-content" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.177998 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerName="extract-content" Dec 09 05:45:00 crc kubenswrapper[4766]: E1209 05:45:00.178020 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerName="extract-utilities" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.178027 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerName="extract-utilities" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.178303 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="486d7e63-e9a9-4ed7-ac44-77276d37fd2f" containerName="registry-server" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.179074 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.182348 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.182984 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.200285 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5"] Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.317128 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-secret-volume\") pod \"collect-profiles-29420985-j4hj5\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.317320 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-config-volume\") pod \"collect-profiles-29420985-j4hj5\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc 
kubenswrapper[4766]: I1209 05:45:00.317768 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56lz\" (UniqueName: \"kubernetes.io/projected/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-kube-api-access-w56lz\") pod \"collect-profiles-29420985-j4hj5\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.419692 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-secret-volume\") pod \"collect-profiles-29420985-j4hj5\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.419747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-config-volume\") pod \"collect-profiles-29420985-j4hj5\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.419931 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w56lz\" (UniqueName: \"kubernetes.io/projected/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-kube-api-access-w56lz\") pod \"collect-profiles-29420985-j4hj5\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.422100 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-config-volume\") pod \"collect-profiles-29420985-j4hj5\" 
(UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.455165 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-secret-volume\") pod \"collect-profiles-29420985-j4hj5\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.464524 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56lz\" (UniqueName: \"kubernetes.io/projected/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-kube-api-access-w56lz\") pod \"collect-profiles-29420985-j4hj5\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.508600 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:00 crc kubenswrapper[4766]: I1209 05:45:00.968619 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5"] Dec 09 05:45:01 crc kubenswrapper[4766]: I1209 05:45:01.347769 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" event={"ID":"0a61d3b4-df1b-4fc9-b471-b8173f01cac3","Type":"ContainerStarted","Data":"afdbe54ff8c2bb6a3a9f4e77a439d5c5e2f06b7be246f0db0e0ec8860fb32f78"} Dec 09 05:45:01 crc kubenswrapper[4766]: I1209 05:45:01.349204 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" event={"ID":"0a61d3b4-df1b-4fc9-b471-b8173f01cac3","Type":"ContainerStarted","Data":"ca7182f089f9c67865a4eef89aff66d1a7793d90dc0979a5e699543ddb6c5d4d"} Dec 09 05:45:01 crc kubenswrapper[4766]: I1209 05:45:01.370800 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" podStartSLOduration=1.3707795489999999 podStartE2EDuration="1.370779549s" podCreationTimestamp="2025-12-09 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 05:45:01.370592404 +0000 UTC m=+9183.079897820" watchObservedRunningTime="2025-12-09 05:45:01.370779549 +0000 UTC m=+9183.080084995" Dec 09 05:45:02 crc kubenswrapper[4766]: I1209 05:45:02.368080 4766 generic.go:334] "Generic (PLEG): container finished" podID="0a61d3b4-df1b-4fc9-b471-b8173f01cac3" containerID="afdbe54ff8c2bb6a3a9f4e77a439d5c5e2f06b7be246f0db0e0ec8860fb32f78" exitCode=0 Dec 09 05:45:02 crc kubenswrapper[4766]: I1209 05:45:02.368230 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" event={"ID":"0a61d3b4-df1b-4fc9-b471-b8173f01cac3","Type":"ContainerDied","Data":"afdbe54ff8c2bb6a3a9f4e77a439d5c5e2f06b7be246f0db0e0ec8860fb32f78"} Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.791479 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.890808 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-secret-volume\") pod \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.890916 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w56lz\" (UniqueName: \"kubernetes.io/projected/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-kube-api-access-w56lz\") pod \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.891144 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-config-volume\") pod \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\" (UID: \"0a61d3b4-df1b-4fc9-b471-b8173f01cac3\") " Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.891606 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-config-volume" (OuterVolumeSpecName: "config-volume") pod "0a61d3b4-df1b-4fc9-b471-b8173f01cac3" (UID: "0a61d3b4-df1b-4fc9-b471-b8173f01cac3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.892069 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.897583 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0a61d3b4-df1b-4fc9-b471-b8173f01cac3" (UID: "0a61d3b4-df1b-4fc9-b471-b8173f01cac3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.898695 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-kube-api-access-w56lz" (OuterVolumeSpecName: "kube-api-access-w56lz") pod "0a61d3b4-df1b-4fc9-b471-b8173f01cac3" (UID: "0a61d3b4-df1b-4fc9-b471-b8173f01cac3"). InnerVolumeSpecName "kube-api-access-w56lz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.994591 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 05:45:03 crc kubenswrapper[4766]: I1209 05:45:03.994624 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w56lz\" (UniqueName: \"kubernetes.io/projected/0a61d3b4-df1b-4fc9-b471-b8173f01cac3-kube-api-access-w56lz\") on node \"crc\" DevicePath \"\"" Dec 09 05:45:04 crc kubenswrapper[4766]: I1209 05:45:04.389527 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" event={"ID":"0a61d3b4-df1b-4fc9-b471-b8173f01cac3","Type":"ContainerDied","Data":"ca7182f089f9c67865a4eef89aff66d1a7793d90dc0979a5e699543ddb6c5d4d"} Dec 09 05:45:04 crc kubenswrapper[4766]: I1209 05:45:04.389848 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca7182f089f9c67865a4eef89aff66d1a7793d90dc0979a5e699543ddb6c5d4d" Dec 09 05:45:04 crc kubenswrapper[4766]: I1209 05:45:04.389596 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420985-j4hj5" Dec 09 05:45:04 crc kubenswrapper[4766]: I1209 05:45:04.453809 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x"] Dec 09 05:45:04 crc kubenswrapper[4766]: I1209 05:45:04.464003 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420940-wt72x"] Dec 09 05:45:04 crc kubenswrapper[4766]: I1209 05:45:04.854578 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07051b6a-932b-405d-a242-6415f1b28e94" path="/var/lib/kubelet/pods/07051b6a-932b-405d-a242-6415f1b28e94/volumes" Dec 09 05:45:07 crc kubenswrapper[4766]: I1209 05:45:07.316069 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:45:07 crc kubenswrapper[4766]: I1209 05:45:07.316398 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:45:07 crc kubenswrapper[4766]: I1209 05:45:07.316447 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:45:07 crc kubenswrapper[4766]: I1209 05:45:07.317386 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a"} 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:45:07 crc kubenswrapper[4766]: I1209 05:45:07.317454 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" gracePeriod=600 Dec 09 05:45:07 crc kubenswrapper[4766]: E1209 05:45:07.443739 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:45:08 crc kubenswrapper[4766]: I1209 05:45:08.434319 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" exitCode=0 Dec 09 05:45:08 crc kubenswrapper[4766]: I1209 05:45:08.434489 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a"} Dec 09 05:45:08 crc kubenswrapper[4766]: I1209 05:45:08.434792 4766 scope.go:117] "RemoveContainer" containerID="3173c583d13d2e08ce38c6221db09bafc0bee8b88892615645184ce7426e6265" Dec 09 05:45:08 crc kubenswrapper[4766]: I1209 05:45:08.435808 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 
09 05:45:08 crc kubenswrapper[4766]: E1209 05:45:08.436255 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:45:19 crc kubenswrapper[4766]: I1209 05:45:19.841190 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:45:19 crc kubenswrapper[4766]: E1209 05:45:19.841974 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:45:27 crc kubenswrapper[4766]: I1209 05:45:27.651854 4766 scope.go:117] "RemoveContainer" containerID="984b6dc6c2b62272e6a701cb7b4b69bd6ae613f5a4475ff15a52c6430dd2fb92" Dec 09 05:45:30 crc kubenswrapper[4766]: I1209 05:45:30.839861 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:45:30 crc kubenswrapper[4766]: E1209 05:45:30.840807 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:45:45 crc kubenswrapper[4766]: I1209 05:45:45.840030 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:45:45 crc kubenswrapper[4766]: E1209 05:45:45.840773 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:45:58 crc kubenswrapper[4766]: I1209 05:45:58.846955 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:45:58 crc kubenswrapper[4766]: E1209 05:45:58.847625 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:46:09 crc kubenswrapper[4766]: I1209 05:46:09.840200 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:46:09 crc kubenswrapper[4766]: E1209 05:46:09.841178 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:46:23 crc kubenswrapper[4766]: I1209 05:46:23.840756 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:46:23 crc kubenswrapper[4766]: E1209 05:46:23.841576 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:46:34 crc kubenswrapper[4766]: I1209 05:46:34.839691 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:46:34 crc kubenswrapper[4766]: E1209 05:46:34.842630 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:46:46 crc kubenswrapper[4766]: I1209 05:46:46.839427 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:46:46 crc kubenswrapper[4766]: E1209 05:46:46.840682 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:47:00 crc kubenswrapper[4766]: I1209 05:47:00.840297 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:47:00 crc kubenswrapper[4766]: E1209 05:47:00.841628 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:47:15 crc kubenswrapper[4766]: I1209 05:47:15.838924 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:47:15 crc kubenswrapper[4766]: E1209 05:47:15.839647 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:47:23 crc kubenswrapper[4766]: I1209 05:47:23.379030 4766 generic.go:334] "Generic (PLEG): container finished" podID="8683912f-3cfa-4505-abfc-49943b3965c7" containerID="ed2de21fd2b19470325e8545381e16cad275cc4dd73d64068e0b52f4e74c23fc" exitCode=0 Dec 09 05:47:23 crc kubenswrapper[4766]: I1209 05:47:23.379125 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" event={"ID":"8683912f-3cfa-4505-abfc-49943b3965c7","Type":"ContainerDied","Data":"ed2de21fd2b19470325e8545381e16cad275cc4dd73d64068e0b52f4e74c23fc"} Dec 09 05:47:24 crc kubenswrapper[4766]: I1209 05:47:24.978996 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048154 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-1\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048247 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-inventory\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048306 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ceph\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048342 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ssh-key\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048435 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-1\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048466 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvqm4\" (UniqueName: \"kubernetes.io/projected/8683912f-3cfa-4505-abfc-49943b3965c7-kube-api-access-hvqm4\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048498 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-0\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048579 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-0\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048620 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-0\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048662 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-combined-ca-bundle\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.048701 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-1\") pod \"8683912f-3cfa-4505-abfc-49943b3965c7\" (UID: \"8683912f-3cfa-4505-abfc-49943b3965c7\") " Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.061142 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8683912f-3cfa-4505-abfc-49943b3965c7-kube-api-access-hvqm4" (OuterVolumeSpecName: "kube-api-access-hvqm4") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "kube-api-access-hvqm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.061140 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.071421 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ceph" (OuterVolumeSpecName: "ceph") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.081149 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.085631 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.086454 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-inventory" (OuterVolumeSpecName: "inventory") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.086799 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.089541 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.091345 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.094426 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.104193 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "8683912f-3cfa-4505-abfc-49943b3965c7" (UID: "8683912f-3cfa-4505-abfc-49943b3965c7"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152201 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152261 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-inventory\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152275 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ceph\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152286 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152298 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152309 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvqm4\" (UniqueName: \"kubernetes.io/projected/8683912f-3cfa-4505-abfc-49943b3965c7-kube-api-access-hvqm4\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152320 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: 
I1209 05:47:25.152331 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152342 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152353 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.152365 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8683912f-3cfa-4505-abfc-49943b3965c7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.402408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" event={"ID":"8683912f-3cfa-4505-abfc-49943b3965c7","Type":"ContainerDied","Data":"42a2c9e4e2128e130a953b5a2c2d13b7248ab006107899c72f5fe1dca94ca19e"} Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.402750 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42a2c9e4e2128e130a953b5a2c2d13b7248ab006107899c72f5fe1dca94ca19e" Dec 09 05:47:25 crc kubenswrapper[4766]: I1209 05:47:25.402517 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v" Dec 09 05:47:29 crc kubenswrapper[4766]: I1209 05:47:29.839116 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:47:29 crc kubenswrapper[4766]: E1209 05:47:29.840092 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:47:42 crc kubenswrapper[4766]: I1209 05:47:42.839907 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:47:42 crc kubenswrapper[4766]: E1209 05:47:42.840783 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.812642 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r6qtn"] Dec 09 05:47:46 crc kubenswrapper[4766]: E1209 05:47:46.813743 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a61d3b4-df1b-4fc9-b471-b8173f01cac3" containerName="collect-profiles" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.813765 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a61d3b4-df1b-4fc9-b471-b8173f01cac3" 
containerName="collect-profiles" Dec 09 05:47:46 crc kubenswrapper[4766]: E1209 05:47:46.813827 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8683912f-3cfa-4505-abfc-49943b3965c7" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.813838 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8683912f-3cfa-4505-abfc-49943b3965c7" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.814132 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8683912f-3cfa-4505-abfc-49943b3965c7" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.814157 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a61d3b4-df1b-4fc9-b471-b8173f01cac3" containerName="collect-profiles" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.816121 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.828142 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6qtn"] Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.967605 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wptrb\" (UniqueName: \"kubernetes.io/projected/d5f71012-27c1-41df-b2e2-40b1819390ab-kube-api-access-wptrb\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.967681 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-catalog-content\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:46 crc kubenswrapper[4766]: I1209 05:47:46.967735 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-utilities\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:47 crc kubenswrapper[4766]: I1209 05:47:47.069410 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wptrb\" (UniqueName: \"kubernetes.io/projected/d5f71012-27c1-41df-b2e2-40b1819390ab-kube-api-access-wptrb\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:47 crc kubenswrapper[4766]: I1209 05:47:47.069753 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-catalog-content\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:47 crc kubenswrapper[4766]: I1209 05:47:47.069804 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-utilities\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:47 crc kubenswrapper[4766]: I1209 05:47:47.070334 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-catalog-content\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:47 crc kubenswrapper[4766]: I1209 05:47:47.070449 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-utilities\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:47 crc kubenswrapper[4766]: I1209 05:47:47.092682 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wptrb\" (UniqueName: \"kubernetes.io/projected/d5f71012-27c1-41df-b2e2-40b1819390ab-kube-api-access-wptrb\") pod \"redhat-operators-r6qtn\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:47 crc kubenswrapper[4766]: I1209 05:47:47.144863 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:47 crc kubenswrapper[4766]: I1209 05:47:47.648565 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6qtn"] Dec 09 05:47:48 crc kubenswrapper[4766]: I1209 05:47:48.704390 4766 generic.go:334] "Generic (PLEG): container finished" podID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerID="7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0" exitCode=0 Dec 09 05:47:48 crc kubenswrapper[4766]: I1209 05:47:48.704946 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6qtn" event={"ID":"d5f71012-27c1-41df-b2e2-40b1819390ab","Type":"ContainerDied","Data":"7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0"} Dec 09 05:47:48 crc kubenswrapper[4766]: I1209 05:47:48.704981 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6qtn" event={"ID":"d5f71012-27c1-41df-b2e2-40b1819390ab","Type":"ContainerStarted","Data":"118b0ecd0a8b56daf1ba25d84e8e2a4650a9a42abbeb95c464f849b3adf81464"} Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.011816 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2ccw6"] Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.014803 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.041819 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ccw6"] Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.117334 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-889zq\" (UniqueName: \"kubernetes.io/projected/3a1871dd-cd0d-4e58-a152-d8606997f73d-kube-api-access-889zq\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.117415 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-catalog-content\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.117466 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-utilities\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.203729 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5dzvf"] Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.207397 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.215505 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dzvf"] Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.219744 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-889zq\" (UniqueName: \"kubernetes.io/projected/3a1871dd-cd0d-4e58-a152-d8606997f73d-kube-api-access-889zq\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.219801 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-catalog-content\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.219838 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-utilities\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.220305 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-catalog-content\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.220319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-utilities\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.255861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-889zq\" (UniqueName: \"kubernetes.io/projected/3a1871dd-cd0d-4e58-a152-d8606997f73d-kube-api-access-889zq\") pod \"certified-operators-2ccw6\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.322143 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwr4s\" (UniqueName: \"kubernetes.io/projected/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-kube-api-access-qwr4s\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.322239 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-catalog-content\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.322422 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-utilities\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.332691 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.424767 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-utilities\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.424847 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwr4s\" (UniqueName: \"kubernetes.io/projected/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-kube-api-access-qwr4s\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.424906 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-catalog-content\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.425307 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-utilities\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.425338 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-catalog-content\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " 
pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.448533 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwr4s\" (UniqueName: \"kubernetes.io/projected/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-kube-api-access-qwr4s\") pod \"redhat-marketplace-5dzvf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.526961 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.737107 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6qtn" event={"ID":"d5f71012-27c1-41df-b2e2-40b1819390ab","Type":"ContainerStarted","Data":"008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5"} Dec 09 05:47:49 crc kubenswrapper[4766]: I1209 05:47:49.940492 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ccw6"] Dec 09 05:47:50 crc kubenswrapper[4766]: I1209 05:47:50.170753 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dzvf"] Dec 09 05:47:50 crc kubenswrapper[4766]: W1209 05:47:50.176056 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd15f16_1a32_467e_b459_5d8eac9fcfcf.slice/crio-e91bf384973e45f620f9b80863fd89f05e499464cddb30e122e328456c5e0c87 WatchSource:0}: Error finding container e91bf384973e45f620f9b80863fd89f05e499464cddb30e122e328456c5e0c87: Status 404 returned error can't find the container with id e91bf384973e45f620f9b80863fd89f05e499464cddb30e122e328456c5e0c87 Dec 09 05:47:50 crc kubenswrapper[4766]: I1209 05:47:50.749475 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerID="7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1" exitCode=0 Dec 09 05:47:50 crc kubenswrapper[4766]: I1209 05:47:50.749923 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccw6" event={"ID":"3a1871dd-cd0d-4e58-a152-d8606997f73d","Type":"ContainerDied","Data":"7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1"} Dec 09 05:47:50 crc kubenswrapper[4766]: I1209 05:47:50.749961 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccw6" event={"ID":"3a1871dd-cd0d-4e58-a152-d8606997f73d","Type":"ContainerStarted","Data":"1b6fdd24d475d0f91733a08d42ed2917ea5f415ee1d1aa4b447ef9b5c72bda1f"} Dec 09 05:47:50 crc kubenswrapper[4766]: I1209 05:47:50.751958 4766 generic.go:334] "Generic (PLEG): container finished" podID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerID="3f5a249bb05637cf7519421556064bc9817f9ddb3d02e3a5bca11c98f1497782" exitCode=0 Dec 09 05:47:50 crc kubenswrapper[4766]: I1209 05:47:50.752159 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dzvf" event={"ID":"2cd15f16-1a32-467e-b459-5d8eac9fcfcf","Type":"ContainerDied","Data":"3f5a249bb05637cf7519421556064bc9817f9ddb3d02e3a5bca11c98f1497782"} Dec 09 05:47:50 crc kubenswrapper[4766]: I1209 05:47:50.752461 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dzvf" event={"ID":"2cd15f16-1a32-467e-b459-5d8eac9fcfcf","Type":"ContainerStarted","Data":"e91bf384973e45f620f9b80863fd89f05e499464cddb30e122e328456c5e0c87"} Dec 09 05:47:52 crc kubenswrapper[4766]: I1209 05:47:52.776493 4766 generic.go:334] "Generic (PLEG): container finished" podID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerID="008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5" exitCode=0 Dec 09 05:47:52 crc kubenswrapper[4766]: I1209 05:47:52.776557 
4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6qtn" event={"ID":"d5f71012-27c1-41df-b2e2-40b1819390ab","Type":"ContainerDied","Data":"008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5"} Dec 09 05:47:52 crc kubenswrapper[4766]: I1209 05:47:52.779576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccw6" event={"ID":"3a1871dd-cd0d-4e58-a152-d8606997f73d","Type":"ContainerStarted","Data":"4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372"} Dec 09 05:47:52 crc kubenswrapper[4766]: I1209 05:47:52.782513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dzvf" event={"ID":"2cd15f16-1a32-467e-b459-5d8eac9fcfcf","Type":"ContainerStarted","Data":"d575d92b70821749175c5989d5007511a4ca50cf1fef9a280c41e6ba50101ae0"} Dec 09 05:47:54 crc kubenswrapper[4766]: I1209 05:47:54.805519 4766 generic.go:334] "Generic (PLEG): container finished" podID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerID="d575d92b70821749175c5989d5007511a4ca50cf1fef9a280c41e6ba50101ae0" exitCode=0 Dec 09 05:47:54 crc kubenswrapper[4766]: I1209 05:47:54.805594 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dzvf" event={"ID":"2cd15f16-1a32-467e-b459-5d8eac9fcfcf","Type":"ContainerDied","Data":"d575d92b70821749175c5989d5007511a4ca50cf1fef9a280c41e6ba50101ae0"} Dec 09 05:47:54 crc kubenswrapper[4766]: I1209 05:47:54.809867 4766 generic.go:334] "Generic (PLEG): container finished" podID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerID="4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372" exitCode=0 Dec 09 05:47:54 crc kubenswrapper[4766]: I1209 05:47:54.809909 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccw6" 
event={"ID":"3a1871dd-cd0d-4e58-a152-d8606997f73d","Type":"ContainerDied","Data":"4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372"} Dec 09 05:47:56 crc kubenswrapper[4766]: I1209 05:47:56.841469 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:47:56 crc kubenswrapper[4766]: E1209 05:47:56.842363 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:47:56 crc kubenswrapper[4766]: I1209 05:47:56.887173 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dzvf" event={"ID":"2cd15f16-1a32-467e-b459-5d8eac9fcfcf","Type":"ContainerStarted","Data":"92d6fe8a1215509d3103bcf8f96aef18063c507b365d6aef8b8d149c7c837240"} Dec 09 05:47:56 crc kubenswrapper[4766]: I1209 05:47:56.900260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6qtn" event={"ID":"d5f71012-27c1-41df-b2e2-40b1819390ab","Type":"ContainerStarted","Data":"d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872"} Dec 09 05:47:56 crc kubenswrapper[4766]: I1209 05:47:56.906585 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccw6" event={"ID":"3a1871dd-cd0d-4e58-a152-d8606997f73d","Type":"ContainerStarted","Data":"692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c"} Dec 09 05:47:56 crc kubenswrapper[4766]: I1209 05:47:56.926238 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5dzvf" 
podStartSLOduration=2.9776955750000003 podStartE2EDuration="7.926196938s" podCreationTimestamp="2025-12-09 05:47:49 +0000 UTC" firstStartedPulling="2025-12-09 05:47:50.755657402 +0000 UTC m=+9352.464962828" lastFinishedPulling="2025-12-09 05:47:55.704158765 +0000 UTC m=+9357.413464191" observedRunningTime="2025-12-09 05:47:56.925056517 +0000 UTC m=+9358.634361943" watchObservedRunningTime="2025-12-09 05:47:56.926196938 +0000 UTC m=+9358.635502364" Dec 09 05:47:56 crc kubenswrapper[4766]: I1209 05:47:56.954739 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2ccw6" podStartSLOduration=4.062099959 podStartE2EDuration="8.95471464s" podCreationTimestamp="2025-12-09 05:47:48 +0000 UTC" firstStartedPulling="2025-12-09 05:47:50.75224083 +0000 UTC m=+9352.461546256" lastFinishedPulling="2025-12-09 05:47:55.644855521 +0000 UTC m=+9357.354160937" observedRunningTime="2025-12-09 05:47:56.950270579 +0000 UTC m=+9358.659576015" watchObservedRunningTime="2025-12-09 05:47:56.95471464 +0000 UTC m=+9358.664020066" Dec 09 05:47:56 crc kubenswrapper[4766]: I1209 05:47:56.974724 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r6qtn" podStartSLOduration=4.253742801 podStartE2EDuration="10.97470101s" podCreationTimestamp="2025-12-09 05:47:46 +0000 UTC" firstStartedPulling="2025-12-09 05:47:48.706410318 +0000 UTC m=+9350.415715744" lastFinishedPulling="2025-12-09 05:47:55.427368507 +0000 UTC m=+9357.136673953" observedRunningTime="2025-12-09 05:47:56.969559831 +0000 UTC m=+9358.678865267" watchObservedRunningTime="2025-12-09 05:47:56.97470101 +0000 UTC m=+9358.684006436" Dec 09 05:47:57 crc kubenswrapper[4766]: I1209 05:47:57.145332 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:57 crc kubenswrapper[4766]: I1209 05:47:57.145387 4766 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:47:58 crc kubenswrapper[4766]: I1209 05:47:58.202616 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r6qtn" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="registry-server" probeResult="failure" output=< Dec 09 05:47:58 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:47:58 crc kubenswrapper[4766]: > Dec 09 05:47:59 crc kubenswrapper[4766]: I1209 05:47:59.333476 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:59 crc kubenswrapper[4766]: I1209 05:47:59.333814 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:47:59 crc kubenswrapper[4766]: I1209 05:47:59.527584 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:47:59 crc kubenswrapper[4766]: I1209 05:47:59.528005 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:48:00 crc kubenswrapper[4766]: I1209 05:48:00.396888 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2ccw6" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="registry-server" probeResult="failure" output=< Dec 09 05:48:00 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:48:00 crc kubenswrapper[4766]: > Dec 09 05:48:00 crc kubenswrapper[4766]: I1209 05:48:00.580916 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5dzvf" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="registry-server" probeResult="failure" output=< Dec 09 05:48:00 crc kubenswrapper[4766]: 
timeout: failed to connect service ":50051" within 1s Dec 09 05:48:00 crc kubenswrapper[4766]: > Dec 09 05:48:08 crc kubenswrapper[4766]: I1209 05:48:08.205764 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r6qtn" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="registry-server" probeResult="failure" output=< Dec 09 05:48:08 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:48:08 crc kubenswrapper[4766]: > Dec 09 05:48:09 crc kubenswrapper[4766]: I1209 05:48:09.379294 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:48:09 crc kubenswrapper[4766]: I1209 05:48:09.443079 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:48:09 crc kubenswrapper[4766]: I1209 05:48:09.585330 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:48:09 crc kubenswrapper[4766]: I1209 05:48:09.641065 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:48:09 crc kubenswrapper[4766]: I1209 05:48:09.654082 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2ccw6"] Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.043992 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2ccw6" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="registry-server" containerID="cri-o://692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c" gracePeriod=2 Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.658452 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.751339 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-catalog-content\") pod \"3a1871dd-cd0d-4e58-a152-d8606997f73d\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.751392 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-889zq\" (UniqueName: \"kubernetes.io/projected/3a1871dd-cd0d-4e58-a152-d8606997f73d-kube-api-access-889zq\") pod \"3a1871dd-cd0d-4e58-a152-d8606997f73d\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.751459 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-utilities\") pod \"3a1871dd-cd0d-4e58-a152-d8606997f73d\" (UID: \"3a1871dd-cd0d-4e58-a152-d8606997f73d\") " Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.752510 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-utilities" (OuterVolumeSpecName: "utilities") pod "3a1871dd-cd0d-4e58-a152-d8606997f73d" (UID: "3a1871dd-cd0d-4e58-a152-d8606997f73d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.757493 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a1871dd-cd0d-4e58-a152-d8606997f73d-kube-api-access-889zq" (OuterVolumeSpecName: "kube-api-access-889zq") pod "3a1871dd-cd0d-4e58-a152-d8606997f73d" (UID: "3a1871dd-cd0d-4e58-a152-d8606997f73d"). InnerVolumeSpecName "kube-api-access-889zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.813724 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a1871dd-cd0d-4e58-a152-d8606997f73d" (UID: "3a1871dd-cd0d-4e58-a152-d8606997f73d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.822324 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dzvf"] Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.822613 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5dzvf" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="registry-server" containerID="cri-o://92d6fe8a1215509d3103bcf8f96aef18063c507b365d6aef8b8d149c7c837240" gracePeriod=2 Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.839150 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:48:11 crc kubenswrapper[4766]: E1209 05:48:11.839636 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.854988 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 
05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.855030 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-889zq\" (UniqueName: \"kubernetes.io/projected/3a1871dd-cd0d-4e58-a152-d8606997f73d-kube-api-access-889zq\") on node \"crc\" DevicePath \"\"" Dec 09 05:48:11 crc kubenswrapper[4766]: I1209 05:48:11.855044 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a1871dd-cd0d-4e58-a152-d8606997f73d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.078184 4766 generic.go:334] "Generic (PLEG): container finished" podID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerID="92d6fe8a1215509d3103bcf8f96aef18063c507b365d6aef8b8d149c7c837240" exitCode=0 Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.078264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dzvf" event={"ID":"2cd15f16-1a32-467e-b459-5d8eac9fcfcf","Type":"ContainerDied","Data":"92d6fe8a1215509d3103bcf8f96aef18063c507b365d6aef8b8d149c7c837240"} Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.080409 4766 generic.go:334] "Generic (PLEG): container finished" podID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerID="692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c" exitCode=0 Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.080445 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccw6" event={"ID":"3a1871dd-cd0d-4e58-a152-d8606997f73d","Type":"ContainerDied","Data":"692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c"} Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.080469 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ccw6" 
event={"ID":"3a1871dd-cd0d-4e58-a152-d8606997f73d","Type":"ContainerDied","Data":"1b6fdd24d475d0f91733a08d42ed2917ea5f415ee1d1aa4b447ef9b5c72bda1f"} Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.080489 4766 scope.go:117] "RemoveContainer" containerID="692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.080655 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ccw6" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.121535 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2ccw6"] Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.142669 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2ccw6"] Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.145252 4766 scope.go:117] "RemoveContainer" containerID="4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.214400 4766 scope.go:117] "RemoveContainer" containerID="7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.395394 4766 scope.go:117] "RemoveContainer" containerID="692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c" Dec 09 05:48:12 crc kubenswrapper[4766]: E1209 05:48:12.404369 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c\": container with ID starting with 692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c not found: ID does not exist" containerID="692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.404421 4766 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c"} err="failed to get container status \"692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c\": rpc error: code = NotFound desc = could not find container \"692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c\": container with ID starting with 692243dc9466196ff6e17affe4cd5435ce1b9ff8ebcaf3f4b03f0a1fb2a3109c not found: ID does not exist" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.404449 4766 scope.go:117] "RemoveContainer" containerID="4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372" Dec 09 05:48:12 crc kubenswrapper[4766]: E1209 05:48:12.408367 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372\": container with ID starting with 4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372 not found: ID does not exist" containerID="4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.408419 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372"} err="failed to get container status \"4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372\": rpc error: code = NotFound desc = could not find container \"4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372\": container with ID starting with 4a20597fc03208eb11dd3ff09d3767ec7e9505a8486a777752a6fd598e5b6372 not found: ID does not exist" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.408449 4766 scope.go:117] "RemoveContainer" containerID="7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1" Dec 09 05:48:12 crc kubenswrapper[4766]: E1209 05:48:12.409230 4766 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1\": container with ID starting with 7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1 not found: ID does not exist" containerID="7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.409285 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1"} err="failed to get container status \"7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1\": rpc error: code = NotFound desc = could not find container \"7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1\": container with ID starting with 7f92c733629555a2b81f9b49592d3b39c4e341f0ce8c6bf96d834cb11aec88f1 not found: ID does not exist" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.577295 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.583204 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwr4s\" (UniqueName: \"kubernetes.io/projected/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-kube-api-access-qwr4s\") pod \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.583428 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-catalog-content\") pod \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.583457 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-utilities\") pod \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\" (UID: \"2cd15f16-1a32-467e-b459-5d8eac9fcfcf\") " Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.583922 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-utilities" (OuterVolumeSpecName: "utilities") pod "2cd15f16-1a32-467e-b459-5d8eac9fcfcf" (UID: "2cd15f16-1a32-467e-b459-5d8eac9fcfcf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.584188 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.590390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-kube-api-access-qwr4s" (OuterVolumeSpecName: "kube-api-access-qwr4s") pod "2cd15f16-1a32-467e-b459-5d8eac9fcfcf" (UID: "2cd15f16-1a32-467e-b459-5d8eac9fcfcf"). InnerVolumeSpecName "kube-api-access-qwr4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.602909 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cd15f16-1a32-467e-b459-5d8eac9fcfcf" (UID: "2cd15f16-1a32-467e-b459-5d8eac9fcfcf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.686258 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.686297 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwr4s\" (UniqueName: \"kubernetes.io/projected/2cd15f16-1a32-467e-b459-5d8eac9fcfcf-kube-api-access-qwr4s\") on node \"crc\" DevicePath \"\"" Dec 09 05:48:12 crc kubenswrapper[4766]: I1209 05:48:12.855911 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" path="/var/lib/kubelet/pods/3a1871dd-cd0d-4e58-a152-d8606997f73d/volumes" Dec 09 05:48:13 crc kubenswrapper[4766]: I1209 05:48:13.094448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dzvf" event={"ID":"2cd15f16-1a32-467e-b459-5d8eac9fcfcf","Type":"ContainerDied","Data":"e91bf384973e45f620f9b80863fd89f05e499464cddb30e122e328456c5e0c87"} Dec 09 05:48:13 crc kubenswrapper[4766]: I1209 05:48:13.094509 4766 scope.go:117] "RemoveContainer" containerID="92d6fe8a1215509d3103bcf8f96aef18063c507b365d6aef8b8d149c7c837240" Dec 09 05:48:13 crc kubenswrapper[4766]: I1209 05:48:13.094507 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dzvf" Dec 09 05:48:13 crc kubenswrapper[4766]: I1209 05:48:13.121454 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dzvf"] Dec 09 05:48:13 crc kubenswrapper[4766]: I1209 05:48:13.137229 4766 scope.go:117] "RemoveContainer" containerID="d575d92b70821749175c5989d5007511a4ca50cf1fef9a280c41e6ba50101ae0" Dec 09 05:48:13 crc kubenswrapper[4766]: I1209 05:48:13.150806 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dzvf"] Dec 09 05:48:13 crc kubenswrapper[4766]: I1209 05:48:13.171347 4766 scope.go:117] "RemoveContainer" containerID="3f5a249bb05637cf7519421556064bc9817f9ddb3d02e3a5bca11c98f1497782" Dec 09 05:48:14 crc kubenswrapper[4766]: I1209 05:48:14.862499 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" path="/var/lib/kubelet/pods/2cd15f16-1a32-467e-b459-5d8eac9fcfcf/volumes" Dec 09 05:48:17 crc kubenswrapper[4766]: I1209 05:48:17.411491 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:48:17 crc kubenswrapper[4766]: I1209 05:48:17.490037 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:48:18 crc kubenswrapper[4766]: I1209 05:48:18.222045 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6qtn"] Dec 09 05:48:19 crc kubenswrapper[4766]: I1209 05:48:19.161568 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r6qtn" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="registry-server" containerID="cri-o://d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872" gracePeriod=2 Dec 09 05:48:19 crc kubenswrapper[4766]: I1209 
05:48:19.941759 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.040681 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-catalog-content\") pod \"d5f71012-27c1-41df-b2e2-40b1819390ab\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.040755 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-utilities\") pod \"d5f71012-27c1-41df-b2e2-40b1819390ab\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.040829 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wptrb\" (UniqueName: \"kubernetes.io/projected/d5f71012-27c1-41df-b2e2-40b1819390ab-kube-api-access-wptrb\") pod \"d5f71012-27c1-41df-b2e2-40b1819390ab\" (UID: \"d5f71012-27c1-41df-b2e2-40b1819390ab\") " Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.041571 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-utilities" (OuterVolumeSpecName: "utilities") pod "d5f71012-27c1-41df-b2e2-40b1819390ab" (UID: "d5f71012-27c1-41df-b2e2-40b1819390ab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.043303 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.046686 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f71012-27c1-41df-b2e2-40b1819390ab-kube-api-access-wptrb" (OuterVolumeSpecName: "kube-api-access-wptrb") pod "d5f71012-27c1-41df-b2e2-40b1819390ab" (UID: "d5f71012-27c1-41df-b2e2-40b1819390ab"). InnerVolumeSpecName "kube-api-access-wptrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.145280 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wptrb\" (UniqueName: \"kubernetes.io/projected/d5f71012-27c1-41df-b2e2-40b1819390ab-kube-api-access-wptrb\") on node \"crc\" DevicePath \"\"" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.146598 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5f71012-27c1-41df-b2e2-40b1819390ab" (UID: "d5f71012-27c1-41df-b2e2-40b1819390ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.172023 4766 generic.go:334] "Generic (PLEG): container finished" podID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerID="d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872" exitCode=0 Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.172122 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6qtn" event={"ID":"d5f71012-27c1-41df-b2e2-40b1819390ab","Type":"ContainerDied","Data":"d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872"} Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.172172 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6qtn" event={"ID":"d5f71012-27c1-41df-b2e2-40b1819390ab","Type":"ContainerDied","Data":"118b0ecd0a8b56daf1ba25d84e8e2a4650a9a42abbeb95c464f849b3adf81464"} Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.172189 4766 scope.go:117] "RemoveContainer" containerID="d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.172133 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6qtn" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.204743 4766 scope.go:117] "RemoveContainer" containerID="008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.216711 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6qtn"] Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.228185 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r6qtn"] Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.229097 4766 scope.go:117] "RemoveContainer" containerID="7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.247427 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5f71012-27c1-41df-b2e2-40b1819390ab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.293592 4766 scope.go:117] "RemoveContainer" containerID="d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872" Dec 09 05:48:20 crc kubenswrapper[4766]: E1209 05:48:20.294124 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872\": container with ID starting with d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872 not found: ID does not exist" containerID="d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.294194 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872"} err="failed to get container status 
\"d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872\": rpc error: code = NotFound desc = could not find container \"d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872\": container with ID starting with d8c13ba2ba020a7cd96992d2ab775db84936f4f9a8d6e6270190f40ce5ba8872 not found: ID does not exist" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.294242 4766 scope.go:117] "RemoveContainer" containerID="008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5" Dec 09 05:48:20 crc kubenswrapper[4766]: E1209 05:48:20.294656 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5\": container with ID starting with 008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5 not found: ID does not exist" containerID="008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.294685 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5"} err="failed to get container status \"008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5\": rpc error: code = NotFound desc = could not find container \"008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5\": container with ID starting with 008d02a9e8ec7fdd88a5b01c9039cbd3fbfcc6df27c14cdb976d567b8d036ba5 not found: ID does not exist" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.294703 4766 scope.go:117] "RemoveContainer" containerID="7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0" Dec 09 05:48:20 crc kubenswrapper[4766]: E1209 05:48:20.295022 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0\": container with ID starting with 7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0 not found: ID does not exist" containerID="7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.295105 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0"} err="failed to get container status \"7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0\": rpc error: code = NotFound desc = could not find container \"7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0\": container with ID starting with 7542ce3bf7309812c9df193635c0583d3d1a023fc44ed536b3fdca23cfa17ba0 not found: ID does not exist" Dec 09 05:48:20 crc kubenswrapper[4766]: I1209 05:48:20.855776 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" path="/var/lib/kubelet/pods/d5f71012-27c1-41df-b2e2-40b1819390ab/volumes" Dec 09 05:48:25 crc kubenswrapper[4766]: I1209 05:48:25.838963 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:48:25 crc kubenswrapper[4766]: E1209 05:48:25.839884 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:48:40 crc kubenswrapper[4766]: I1209 05:48:40.839617 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:48:40 crc 
kubenswrapper[4766]: E1209 05:48:40.840523 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:48:54 crc kubenswrapper[4766]: I1209 05:48:54.839013 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:48:54 crc kubenswrapper[4766]: E1209 05:48:54.839868 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:49:07 crc kubenswrapper[4766]: I1209 05:49:07.839891 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:49:07 crc kubenswrapper[4766]: E1209 05:49:07.841135 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:49:21 crc kubenswrapper[4766]: I1209 05:49:21.840893 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 
09 05:49:21 crc kubenswrapper[4766]: E1209 05:49:21.842175 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:49:33 crc kubenswrapper[4766]: I1209 05:49:33.839604 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:49:33 crc kubenswrapper[4766]: E1209 05:49:33.840596 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:49:40 crc kubenswrapper[4766]: I1209 05:49:40.148126 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 05:49:40 crc kubenswrapper[4766]: I1209 05:49:40.148833 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="1900c1e7-52f3-47a4-933d-84b2816bf2e8" containerName="adoption" containerID="cri-o://687a52b4967a2e88c68dbdbfccfd77b69ca1c9da06e35498ff2363c2f7d7d096" gracePeriod=30 Dec 09 05:49:47 crc kubenswrapper[4766]: I1209 05:49:47.840169 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:49:47 crc kubenswrapper[4766]: E1209 05:49:47.841375 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:50:02 crc kubenswrapper[4766]: I1209 05:50:02.839358 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:50:02 crc kubenswrapper[4766]: E1209 05:50:02.842050 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.432472 4766 generic.go:334] "Generic (PLEG): container finished" podID="1900c1e7-52f3-47a4-933d-84b2816bf2e8" containerID="687a52b4967a2e88c68dbdbfccfd77b69ca1c9da06e35498ff2363c2f7d7d096" exitCode=137 Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.432562 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1900c1e7-52f3-47a4-933d-84b2816bf2e8","Type":"ContainerDied","Data":"687a52b4967a2e88c68dbdbfccfd77b69ca1c9da06e35498ff2363c2f7d7d096"} Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.607108 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.700362 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\") pod \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") " Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.700470 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k78s\" (UniqueName: \"kubernetes.io/projected/1900c1e7-52f3-47a4-933d-84b2816bf2e8-kube-api-access-5k78s\") pod \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\" (UID: \"1900c1e7-52f3-47a4-933d-84b2816bf2e8\") " Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.706625 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1900c1e7-52f3-47a4-933d-84b2816bf2e8-kube-api-access-5k78s" (OuterVolumeSpecName: "kube-api-access-5k78s") pod "1900c1e7-52f3-47a4-933d-84b2816bf2e8" (UID: "1900c1e7-52f3-47a4-933d-84b2816bf2e8"). InnerVolumeSpecName "kube-api-access-5k78s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.726380 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7" (OuterVolumeSpecName: "mariadb-data") pod "1900c1e7-52f3-47a4-933d-84b2816bf2e8" (UID: "1900c1e7-52f3-47a4-933d-84b2816bf2e8"). InnerVolumeSpecName "pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.802862 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\") on node \"crc\" " Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.802903 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k78s\" (UniqueName: \"kubernetes.io/projected/1900c1e7-52f3-47a4-933d-84b2816bf2e8-kube-api-access-5k78s\") on node \"crc\" DevicePath \"\"" Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.832892 4766 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.833055 4766 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7") on node "crc" Dec 09 05:50:10 crc kubenswrapper[4766]: I1209 05:50:10.905520 4766 reconciler_common.go:293] "Volume detached for volume \"pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c048a4ac-b1bf-45fc-a795-1c9f7deb04a7\") on node \"crc\" DevicePath \"\"" Dec 09 05:50:11 crc kubenswrapper[4766]: I1209 05:50:11.446935 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1900c1e7-52f3-47a4-933d-84b2816bf2e8","Type":"ContainerDied","Data":"b42181778f66570a909cf354a41016fea4918cb35792321e397d8edcbab4adba"} Dec 09 05:50:11 crc kubenswrapper[4766]: I1209 05:50:11.447021 4766 scope.go:117] "RemoveContainer" containerID="687a52b4967a2e88c68dbdbfccfd77b69ca1c9da06e35498ff2363c2f7d7d096" Dec 09 05:50:11 crc kubenswrapper[4766]: I1209 05:50:11.447065 
4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 09 05:50:11 crc kubenswrapper[4766]: I1209 05:50:11.484311 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 05:50:11 crc kubenswrapper[4766]: I1209 05:50:11.503108 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 09 05:50:12 crc kubenswrapper[4766]: I1209 05:50:12.221716 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 05:50:12 crc kubenswrapper[4766]: I1209 05:50:12.221983 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="2b86ac1c-358a-4c50-9d62-f39f667a1d6d" containerName="adoption" containerID="cri-o://1b7ab55a4c6f10c9e96036b4d7e4aec07a6a06709402a67aee4da01ca36b5fe3" gracePeriod=30 Dec 09 05:50:12 crc kubenswrapper[4766]: I1209 05:50:12.863532 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1900c1e7-52f3-47a4-933d-84b2816bf2e8" path="/var/lib/kubelet/pods/1900c1e7-52f3-47a4-933d-84b2816bf2e8/volumes" Dec 09 05:50:15 crc kubenswrapper[4766]: I1209 05:50:15.839959 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:50:16 crc kubenswrapper[4766]: I1209 05:50:16.514174 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"a91ffe188e8db06586ea61e7c1b2afd36c11fb9b8087b7474593627431c68daf"} Dec 09 05:50:42 crc kubenswrapper[4766]: I1209 05:50:42.803598 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b86ac1c-358a-4c50-9d62-f39f667a1d6d" containerID="1b7ab55a4c6f10c9e96036b4d7e4aec07a6a06709402a67aee4da01ca36b5fe3" exitCode=137 Dec 09 05:50:42 crc kubenswrapper[4766]: I1209 05:50:42.803704 
4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"2b86ac1c-358a-4c50-9d62-f39f667a1d6d","Type":"ContainerDied","Data":"1b7ab55a4c6f10c9e96036b4d7e4aec07a6a06709402a67aee4da01ca36b5fe3"} Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.199789 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.300778 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l598\" (UniqueName: \"kubernetes.io/projected/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-kube-api-access-8l598\") pod \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.301235 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20\") pod \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.301329 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-ovn-data-cert\") pod \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\" (UID: \"2b86ac1c-358a-4c50-9d62-f39f667a1d6d\") " Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.307536 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-kube-api-access-8l598" (OuterVolumeSpecName: "kube-api-access-8l598") pod "2b86ac1c-358a-4c50-9d62-f39f667a1d6d" (UID: "2b86ac1c-358a-4c50-9d62-f39f667a1d6d"). InnerVolumeSpecName "kube-api-access-8l598". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.308071 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "2b86ac1c-358a-4c50-9d62-f39f667a1d6d" (UID: "2b86ac1c-358a-4c50-9d62-f39f667a1d6d"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.322902 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20" (OuterVolumeSpecName: "ovn-data") pod "2b86ac1c-358a-4c50-9d62-f39f667a1d6d" (UID: "2b86ac1c-358a-4c50-9d62-f39f667a1d6d"). InnerVolumeSpecName "pvc-4be9a001-60db-4cc6-a436-433efee77f20". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.403971 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l598\" (UniqueName: \"kubernetes.io/projected/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-kube-api-access-8l598\") on node \"crc\" DevicePath \"\"" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.404029 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4be9a001-60db-4cc6-a436-433efee77f20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20\") on node \"crc\" " Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.404042 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/2b86ac1c-358a-4c50-9d62-f39f667a1d6d-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.430547 4766 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.430682 4766 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4be9a001-60db-4cc6-a436-433efee77f20" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20") on node "crc" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.507127 4766 reconciler_common.go:293] "Volume detached for volume \"pvc-4be9a001-60db-4cc6-a436-433efee77f20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4be9a001-60db-4cc6-a436-433efee77f20\") on node \"crc\" DevicePath \"\"" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.817847 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"2b86ac1c-358a-4c50-9d62-f39f667a1d6d","Type":"ContainerDied","Data":"87152dc48e1c1e89eca0b8c5cf9c634826066efc61c47ccf632b1ea6861fb1b6"} Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.817906 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.818115 4766 scope.go:117] "RemoveContainer" containerID="1b7ab55a4c6f10c9e96036b4d7e4aec07a6a06709402a67aee4da01ca36b5fe3" Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.864773 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 05:50:43 crc kubenswrapper[4766]: I1209 05:50:43.879115 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 09 05:50:44 crc kubenswrapper[4766]: I1209 05:50:44.852487 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b86ac1c-358a-4c50-9d62-f39f667a1d6d" path="/var/lib/kubelet/pods/2b86ac1c-358a-4c50-9d62-f39f667a1d6d/volumes" Dec 09 05:51:27 crc kubenswrapper[4766]: I1209 05:51:27.915893 4766 scope.go:117] "RemoveContainer" containerID="ef3e7a4198b51179020ac1aa117b21bcda0a7dd72fdacadb300d6219f91157c8" Dec 09 05:51:27 crc kubenswrapper[4766]: I1209 05:51:27.959725 4766 scope.go:117] "RemoveContainer" containerID="7e3c3b09fa1e20df9513867d5f8f5ce9c4c9d406fbe6de5a812316224b1a1e1f" Dec 09 05:51:28 crc kubenswrapper[4766]: I1209 05:51:28.040921 4766 scope.go:117] "RemoveContainer" containerID="740b7d6688d980c1b07d17d2abee08e68c3f329425dd3e7faf68afb4d2ff645b" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.555788 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-47jdn/must-gather-s8k2m"] Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556727 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="extract-utilities" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556741 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="extract-utilities" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556760 4766 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="extract-content" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556767 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="extract-content" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556788 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556794 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556806 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="extract-utilities" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556811 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="extract-utilities" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556821 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="extract-content" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556827 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="extract-content" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556835 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556840 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556863 4766 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="extract-utilities" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556869 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="extract-utilities" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556877 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556882 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556895 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b86ac1c-358a-4c50-9d62-f39f667a1d6d" containerName="adoption" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556903 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b86ac1c-358a-4c50-9d62-f39f667a1d6d" containerName="adoption" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556913 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1900c1e7-52f3-47a4-933d-84b2816bf2e8" containerName="adoption" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556919 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1900c1e7-52f3-47a4-933d-84b2816bf2e8" containerName="adoption" Dec 09 05:51:51 crc kubenswrapper[4766]: E1209 05:51:51.556933 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="extract-content" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.556939 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="extract-content" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.557119 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2b86ac1c-358a-4c50-9d62-f39f667a1d6d" containerName="adoption" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.557133 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a1871dd-cd0d-4e58-a152-d8606997f73d" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.557178 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd15f16-1a32-467e-b459-5d8eac9fcfcf" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.557190 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f71012-27c1-41df-b2e2-40b1819390ab" containerName="registry-server" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.557208 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1900c1e7-52f3-47a4-933d-84b2816bf2e8" containerName="adoption" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.558384 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.560245 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-47jdn"/"kube-root-ca.crt" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.560332 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-47jdn"/"default-dockercfg-jcl2f" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.560419 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-47jdn"/"openshift-service-ca.crt" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.579775 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-47jdn/must-gather-s8k2m"] Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.721400 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm5qp\" (UniqueName: 
\"kubernetes.io/projected/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-kube-api-access-bm5qp\") pod \"must-gather-s8k2m\" (UID: \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\") " pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.721723 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-must-gather-output\") pod \"must-gather-s8k2m\" (UID: \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\") " pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.823680 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm5qp\" (UniqueName: \"kubernetes.io/projected/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-kube-api-access-bm5qp\") pod \"must-gather-s8k2m\" (UID: \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\") " pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.823743 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-must-gather-output\") pod \"must-gather-s8k2m\" (UID: \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\") " pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 05:51:51 crc kubenswrapper[4766]: I1209 05:51:51.824189 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-must-gather-output\") pod \"must-gather-s8k2m\" (UID: \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\") " pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 05:51:52 crc kubenswrapper[4766]: I1209 05:51:52.258278 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm5qp\" (UniqueName: 
\"kubernetes.io/projected/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-kube-api-access-bm5qp\") pod \"must-gather-s8k2m\" (UID: \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\") " pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 05:51:52 crc kubenswrapper[4766]: I1209 05:51:52.480460 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 05:51:53 crc kubenswrapper[4766]: I1209 05:51:53.078659 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-47jdn/must-gather-s8k2m"] Dec 09 05:51:53 crc kubenswrapper[4766]: I1209 05:51:53.085123 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:51:53 crc kubenswrapper[4766]: I1209 05:51:53.636430 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47jdn/must-gather-s8k2m" event={"ID":"b04d667e-4be1-4832-b9f0-bf6942ff1c7e","Type":"ContainerStarted","Data":"f5ea97edec06dd179682a8011f68717770f85764be9922a7b742246103e2c12d"} Dec 09 05:51:59 crc kubenswrapper[4766]: I1209 05:51:59.712255 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47jdn/must-gather-s8k2m" event={"ID":"b04d667e-4be1-4832-b9f0-bf6942ff1c7e","Type":"ContainerStarted","Data":"5eb40f6f50a2b56a89e52b841904030db4f72e34c0a2c482140ba3d8c013d6b3"} Dec 09 05:52:00 crc kubenswrapper[4766]: I1209 05:52:00.723792 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47jdn/must-gather-s8k2m" event={"ID":"b04d667e-4be1-4832-b9f0-bf6942ff1c7e","Type":"ContainerStarted","Data":"75395de82e0940bd7d099e90c1f737824260bbd0ff3437841632eb95fd3045e1"} Dec 09 05:52:00 crc kubenswrapper[4766]: I1209 05:52:00.763124 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-47jdn/must-gather-s8k2m" podStartSLOduration=3.486964626 podStartE2EDuration="9.763101489s" podCreationTimestamp="2025-12-09 
05:51:51 +0000 UTC" firstStartedPulling="2025-12-09 05:51:53.084766219 +0000 UTC m=+9594.794071655" lastFinishedPulling="2025-12-09 05:51:59.360903092 +0000 UTC m=+9601.070208518" observedRunningTime="2025-12-09 05:52:00.738862703 +0000 UTC m=+9602.448168139" watchObservedRunningTime="2025-12-09 05:52:00.763101489 +0000 UTC m=+9602.472406925" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.054954 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-47jdn/crc-debug-8nvfc"] Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.057052 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.142373 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwfds\" (UniqueName: \"kubernetes.io/projected/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-kube-api-access-gwfds\") pod \"crc-debug-8nvfc\" (UID: \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\") " pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.142490 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-host\") pod \"crc-debug-8nvfc\" (UID: \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\") " pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.244125 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwfds\" (UniqueName: \"kubernetes.io/projected/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-kube-api-access-gwfds\") pod \"crc-debug-8nvfc\" (UID: \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\") " pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.244284 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-host\") pod \"crc-debug-8nvfc\" (UID: \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\") " pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.244371 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-host\") pod \"crc-debug-8nvfc\" (UID: \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\") " pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.273023 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwfds\" (UniqueName: \"kubernetes.io/projected/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-kube-api-access-gwfds\") pod \"crc-debug-8nvfc\" (UID: \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\") " pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.377928 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:05 crc kubenswrapper[4766]: I1209 05:52:05.782879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47jdn/crc-debug-8nvfc" event={"ID":"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc","Type":"ContainerStarted","Data":"6ef6e2bb836b8659023237971e4793872ecf5307b0677521fb269e0096db6834"} Dec 09 05:52:16 crc kubenswrapper[4766]: I1209 05:52:16.918337 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47jdn/crc-debug-8nvfc" event={"ID":"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc","Type":"ContainerStarted","Data":"1b3a0f2428ca875a234b418c6d8a7724d25e5dba181181560930914d53818415"} Dec 09 05:52:16 crc kubenswrapper[4766]: I1209 05:52:16.939530 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-47jdn/crc-debug-8nvfc" podStartSLOduration=0.936250871 podStartE2EDuration="11.939516048s" podCreationTimestamp="2025-12-09 05:52:05 +0000 UTC" firstStartedPulling="2025-12-09 05:52:05.437476496 +0000 UTC m=+9607.146781922" lastFinishedPulling="2025-12-09 05:52:16.440741683 +0000 UTC m=+9618.150047099" observedRunningTime="2025-12-09 05:52:16.932806626 +0000 UTC m=+9618.642112052" watchObservedRunningTime="2025-12-09 05:52:16.939516048 +0000 UTC m=+9618.648821474" Dec 09 05:52:37 crc kubenswrapper[4766]: I1209 05:52:37.316466 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:52:37 crc kubenswrapper[4766]: I1209 05:52:37.317870 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:52:38 crc kubenswrapper[4766]: I1209 05:52:38.137055 4766 generic.go:334] "Generic (PLEG): container finished" podID="c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc" containerID="1b3a0f2428ca875a234b418c6d8a7724d25e5dba181181560930914d53818415" exitCode=0 Dec 09 05:52:38 crc kubenswrapper[4766]: I1209 05:52:38.137317 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47jdn/crc-debug-8nvfc" event={"ID":"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc","Type":"ContainerDied","Data":"1b3a0f2428ca875a234b418c6d8a7724d25e5dba181181560930914d53818415"} Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.272337 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.305593 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-47jdn/crc-debug-8nvfc"] Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.320532 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-47jdn/crc-debug-8nvfc"] Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.450724 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-host\") pod \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\" (UID: \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\") " Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.450846 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwfds\" (UniqueName: \"kubernetes.io/projected/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-kube-api-access-gwfds\") pod \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\" (UID: \"c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc\") " Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.451531 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-host" (OuterVolumeSpecName: "host") pod "c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc" (UID: "c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.465632 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-kube-api-access-gwfds" (OuterVolumeSpecName: "kube-api-access-gwfds") pod "c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc" (UID: "c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc"). InnerVolumeSpecName "kube-api-access-gwfds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.553560 4766 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-host\") on node \"crc\" DevicePath \"\"" Dec 09 05:52:39 crc kubenswrapper[4766]: I1209 05:52:39.553598 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwfds\" (UniqueName: \"kubernetes.io/projected/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc-kube-api-access-gwfds\") on node \"crc\" DevicePath \"\"" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.156760 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ef6e2bb836b8659023237971e4793872ecf5307b0677521fb269e0096db6834" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.156874 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-47jdn/crc-debug-8nvfc" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.505517 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-47jdn/crc-debug-vjs2n"] Dec 09 05:52:40 crc kubenswrapper[4766]: E1209 05:52:40.506368 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc" containerName="container-00" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.506383 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc" containerName="container-00" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.506645 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc" containerName="container-00" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.507672 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.677029 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntm6\" (UniqueName: \"kubernetes.io/projected/2b708883-bc3a-41a4-8b40-1b2410f473e5-kube-api-access-gntm6\") pod \"crc-debug-vjs2n\" (UID: \"2b708883-bc3a-41a4-8b40-1b2410f473e5\") " pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.677258 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b708883-bc3a-41a4-8b40-1b2410f473e5-host\") pod \"crc-debug-vjs2n\" (UID: \"2b708883-bc3a-41a4-8b40-1b2410f473e5\") " pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.779376 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2b708883-bc3a-41a4-8b40-1b2410f473e5-host\") pod \"crc-debug-vjs2n\" (UID: \"2b708883-bc3a-41a4-8b40-1b2410f473e5\") " pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.779488 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gntm6\" (UniqueName: \"kubernetes.io/projected/2b708883-bc3a-41a4-8b40-1b2410f473e5-kube-api-access-gntm6\") pod \"crc-debug-vjs2n\" (UID: \"2b708883-bc3a-41a4-8b40-1b2410f473e5\") " pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.779496 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b708883-bc3a-41a4-8b40-1b2410f473e5-host\") pod \"crc-debug-vjs2n\" (UID: \"2b708883-bc3a-41a4-8b40-1b2410f473e5\") " pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.799916 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntm6\" (UniqueName: \"kubernetes.io/projected/2b708883-bc3a-41a4-8b40-1b2410f473e5-kube-api-access-gntm6\") pod \"crc-debug-vjs2n\" (UID: \"2b708883-bc3a-41a4-8b40-1b2410f473e5\") " pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.828857 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:40 crc kubenswrapper[4766]: I1209 05:52:40.858414 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc" path="/var/lib/kubelet/pods/c6ccecb6-9ed3-4d42-89d1-9077df8aa3dc/volumes" Dec 09 05:52:41 crc kubenswrapper[4766]: I1209 05:52:41.168391 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b708883-bc3a-41a4-8b40-1b2410f473e5" containerID="95b204459935df813fcd30c349b63bcd891fa3b961ed78a6edf9587a44d5f223" exitCode=1 Dec 09 05:52:41 crc kubenswrapper[4766]: I1209 05:52:41.168593 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47jdn/crc-debug-vjs2n" event={"ID":"2b708883-bc3a-41a4-8b40-1b2410f473e5","Type":"ContainerDied","Data":"95b204459935df813fcd30c349b63bcd891fa3b961ed78a6edf9587a44d5f223"} Dec 09 05:52:41 crc kubenswrapper[4766]: I1209 05:52:41.168683 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-47jdn/crc-debug-vjs2n" event={"ID":"2b708883-bc3a-41a4-8b40-1b2410f473e5","Type":"ContainerStarted","Data":"a9892676631cae17d16b105551b504dfc526c19cea76641c1bb24d7855cfa1b1"} Dec 09 05:52:41 crc kubenswrapper[4766]: I1209 05:52:41.222350 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-47jdn/crc-debug-vjs2n"] Dec 09 05:52:41 crc kubenswrapper[4766]: I1209 05:52:41.236593 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-47jdn/crc-debug-vjs2n"] Dec 09 05:52:42 crc kubenswrapper[4766]: I1209 05:52:42.301885 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:52:42 crc kubenswrapper[4766]: I1209 05:52:42.411651 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gntm6\" (UniqueName: \"kubernetes.io/projected/2b708883-bc3a-41a4-8b40-1b2410f473e5-kube-api-access-gntm6\") pod \"2b708883-bc3a-41a4-8b40-1b2410f473e5\" (UID: \"2b708883-bc3a-41a4-8b40-1b2410f473e5\") " Dec 09 05:52:42 crc kubenswrapper[4766]: I1209 05:52:42.411766 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b708883-bc3a-41a4-8b40-1b2410f473e5-host\") pod \"2b708883-bc3a-41a4-8b40-1b2410f473e5\" (UID: \"2b708883-bc3a-41a4-8b40-1b2410f473e5\") " Dec 09 05:52:42 crc kubenswrapper[4766]: I1209 05:52:42.411907 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b708883-bc3a-41a4-8b40-1b2410f473e5-host" (OuterVolumeSpecName: "host") pod "2b708883-bc3a-41a4-8b40-1b2410f473e5" (UID: "2b708883-bc3a-41a4-8b40-1b2410f473e5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 05:52:42 crc kubenswrapper[4766]: I1209 05:52:42.412624 4766 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b708883-bc3a-41a4-8b40-1b2410f473e5-host\") on node \"crc\" DevicePath \"\"" Dec 09 05:52:42 crc kubenswrapper[4766]: I1209 05:52:42.419575 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b708883-bc3a-41a4-8b40-1b2410f473e5-kube-api-access-gntm6" (OuterVolumeSpecName: "kube-api-access-gntm6") pod "2b708883-bc3a-41a4-8b40-1b2410f473e5" (UID: "2b708883-bc3a-41a4-8b40-1b2410f473e5"). InnerVolumeSpecName "kube-api-access-gntm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:52:42 crc kubenswrapper[4766]: I1209 05:52:42.514870 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gntm6\" (UniqueName: \"kubernetes.io/projected/2b708883-bc3a-41a4-8b40-1b2410f473e5-kube-api-access-gntm6\") on node \"crc\" DevicePath \"\"" Dec 09 05:52:42 crc kubenswrapper[4766]: I1209 05:52:42.851520 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b708883-bc3a-41a4-8b40-1b2410f473e5" path="/var/lib/kubelet/pods/2b708883-bc3a-41a4-8b40-1b2410f473e5/volumes" Dec 09 05:52:43 crc kubenswrapper[4766]: I1209 05:52:43.193131 4766 scope.go:117] "RemoveContainer" containerID="95b204459935df813fcd30c349b63bcd891fa3b961ed78a6edf9587a44d5f223" Dec 09 05:52:43 crc kubenswrapper[4766]: I1209 05:52:43.193191 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47jdn/crc-debug-vjs2n" Dec 09 05:53:07 crc kubenswrapper[4766]: I1209 05:53:07.315992 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:53:07 crc kubenswrapper[4766]: I1209 05:53:07.316580 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.316263 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.316753 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.316789 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.317613 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a91ffe188e8db06586ea61e7c1b2afd36c11fb9b8087b7474593627431c68daf"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.317666 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://a91ffe188e8db06586ea61e7c1b2afd36c11fb9b8087b7474593627431c68daf" gracePeriod=600 Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.780096 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="a91ffe188e8db06586ea61e7c1b2afd36c11fb9b8087b7474593627431c68daf" exitCode=0 Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.780165 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" 
event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"a91ffe188e8db06586ea61e7c1b2afd36c11fb9b8087b7474593627431c68daf"} Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.780722 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86"} Dec 09 05:53:37 crc kubenswrapper[4766]: I1209 05:53:37.780746 4766 scope.go:117] "RemoveContainer" containerID="ebefbaa2591f49abe065d09d0ad525f77cc273a3129eba08947cb8c05e9d4d0a" Dec 09 05:55:37 crc kubenswrapper[4766]: I1209 05:55:37.319205 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:55:37 crc kubenswrapper[4766]: I1209 05:55:37.320110 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.722381 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d5d2v"] Dec 09 05:55:55 crc kubenswrapper[4766]: E1209 05:55:55.724916 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b708883-bc3a-41a4-8b40-1b2410f473e5" containerName="container-00" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.725086 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b708883-bc3a-41a4-8b40-1b2410f473e5" containerName="container-00" Dec 09 05:55:55 crc 
kubenswrapper[4766]: I1209 05:55:55.725627 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b708883-bc3a-41a4-8b40-1b2410f473e5" containerName="container-00" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.727994 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.731529 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5d2v"] Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.804423 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk7jp\" (UniqueName: \"kubernetes.io/projected/e0dd3ab4-6d2e-43ff-a318-112537d04862-kube-api-access-vk7jp\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.804541 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0dd3ab4-6d2e-43ff-a318-112537d04862-catalog-content\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.804635 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0dd3ab4-6d2e-43ff-a318-112537d04862-utilities\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.906594 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk7jp\" (UniqueName: 
\"kubernetes.io/projected/e0dd3ab4-6d2e-43ff-a318-112537d04862-kube-api-access-vk7jp\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.906702 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0dd3ab4-6d2e-43ff-a318-112537d04862-catalog-content\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.906786 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0dd3ab4-6d2e-43ff-a318-112537d04862-utilities\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.907225 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0dd3ab4-6d2e-43ff-a318-112537d04862-utilities\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.908342 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0dd3ab4-6d2e-43ff-a318-112537d04862-catalog-content\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:55 crc kubenswrapper[4766]: I1209 05:55:55.925170 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk7jp\" (UniqueName: 
\"kubernetes.io/projected/e0dd3ab4-6d2e-43ff-a318-112537d04862-kube-api-access-vk7jp\") pod \"community-operators-d5d2v\" (UID: \"e0dd3ab4-6d2e-43ff-a318-112537d04862\") " pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:56 crc kubenswrapper[4766]: I1209 05:55:56.085438 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:55:56 crc kubenswrapper[4766]: I1209 05:55:56.568652 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5d2v"] Dec 09 05:55:56 crc kubenswrapper[4766]: W1209 05:55:56.575367 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0dd3ab4_6d2e_43ff_a318_112537d04862.slice/crio-6cb64edd656bcd56ee60a5df73851ea286e49eb5f777e07be6caa8e34fc4fe27 WatchSource:0}: Error finding container 6cb64edd656bcd56ee60a5df73851ea286e49eb5f777e07be6caa8e34fc4fe27: Status 404 returned error can't find the container with id 6cb64edd656bcd56ee60a5df73851ea286e49eb5f777e07be6caa8e34fc4fe27 Dec 09 05:55:57 crc kubenswrapper[4766]: I1209 05:55:57.425181 4766 generic.go:334] "Generic (PLEG): container finished" podID="e0dd3ab4-6d2e-43ff-a318-112537d04862" containerID="710603a0f88f975bb437ba4a9b44d173b51152134f82904a66c933c375ea6d53" exitCode=0 Dec 09 05:55:57 crc kubenswrapper[4766]: I1209 05:55:57.425266 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5d2v" event={"ID":"e0dd3ab4-6d2e-43ff-a318-112537d04862","Type":"ContainerDied","Data":"710603a0f88f975bb437ba4a9b44d173b51152134f82904a66c933c375ea6d53"} Dec 09 05:55:57 crc kubenswrapper[4766]: I1209 05:55:57.427382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5d2v" 
event={"ID":"e0dd3ab4-6d2e-43ff-a318-112537d04862","Type":"ContainerStarted","Data":"6cb64edd656bcd56ee60a5df73851ea286e49eb5f777e07be6caa8e34fc4fe27"} Dec 09 05:56:02 crc kubenswrapper[4766]: I1209 05:56:02.493152 4766 generic.go:334] "Generic (PLEG): container finished" podID="e0dd3ab4-6d2e-43ff-a318-112537d04862" containerID="61879c704fe7c0e4a385e68cac85ede70bf43ec65850f0690bedcd0b152df103" exitCode=0 Dec 09 05:56:02 crc kubenswrapper[4766]: I1209 05:56:02.493286 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5d2v" event={"ID":"e0dd3ab4-6d2e-43ff-a318-112537d04862","Type":"ContainerDied","Data":"61879c704fe7c0e4a385e68cac85ede70bf43ec65850f0690bedcd0b152df103"} Dec 09 05:56:04 crc kubenswrapper[4766]: I1209 05:56:04.520579 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5d2v" event={"ID":"e0dd3ab4-6d2e-43ff-a318-112537d04862","Type":"ContainerStarted","Data":"f1a5445e5d4518c67204a0c42f2dae6746f75888b94fc94f2205751af9a90ca7"} Dec 09 05:56:04 crc kubenswrapper[4766]: I1209 05:56:04.550594 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d5d2v" podStartSLOduration=4.047518766 podStartE2EDuration="9.550571783s" podCreationTimestamp="2025-12-09 05:55:55 +0000 UTC" firstStartedPulling="2025-12-09 05:55:57.428200185 +0000 UTC m=+9839.137505601" lastFinishedPulling="2025-12-09 05:56:02.931253192 +0000 UTC m=+9844.640558618" observedRunningTime="2025-12-09 05:56:04.547377817 +0000 UTC m=+9846.256683273" watchObservedRunningTime="2025-12-09 05:56:04.550571783 +0000 UTC m=+9846.259877219" Dec 09 05:56:06 crc kubenswrapper[4766]: I1209 05:56:06.086048 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:56:06 crc kubenswrapper[4766]: I1209 05:56:06.086407 4766 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:56:06 crc kubenswrapper[4766]: I1209 05:56:06.148193 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:56:07 crc kubenswrapper[4766]: I1209 05:56:07.316858 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:56:07 crc kubenswrapper[4766]: I1209 05:56:07.317312 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.154999 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d5d2v" Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.247072 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5d2v"] Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.303464 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pt6rh"] Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.303806 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pt6rh" podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerName="registry-server" containerID="cri-o://5f99dbbc13ee0489976e6eee73f09e7f039e218f91aee1be5a7940a48887d3fa" gracePeriod=2 Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.672939 4766 
generic.go:334] "Generic (PLEG): container finished" podID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerID="5f99dbbc13ee0489976e6eee73f09e7f039e218f91aee1be5a7940a48887d3fa" exitCode=0 Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.672999 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pt6rh" event={"ID":"8aadd3b6-2c6f-414f-ab7f-bbd82c568689","Type":"ContainerDied","Data":"5f99dbbc13ee0489976e6eee73f09e7f039e218f91aee1be5a7940a48887d3fa"} Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.792989 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pt6rh" Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.845439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2rdg\" (UniqueName: \"kubernetes.io/projected/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-kube-api-access-f2rdg\") pod \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.845677 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-catalog-content\") pod \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.845870 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-utilities\") pod \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\" (UID: \"8aadd3b6-2c6f-414f-ab7f-bbd82c568689\") " Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.847517 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-utilities" 
(OuterVolumeSpecName: "utilities") pod "8aadd3b6-2c6f-414f-ab7f-bbd82c568689" (UID: "8aadd3b6-2c6f-414f-ab7f-bbd82c568689"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.864362 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-kube-api-access-f2rdg" (OuterVolumeSpecName: "kube-api-access-f2rdg") pod "8aadd3b6-2c6f-414f-ab7f-bbd82c568689" (UID: "8aadd3b6-2c6f-414f-ab7f-bbd82c568689"). InnerVolumeSpecName "kube-api-access-f2rdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.917515 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8aadd3b6-2c6f-414f-ab7f-bbd82c568689" (UID: "8aadd3b6-2c6f-414f-ab7f-bbd82c568689"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.948065 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.948098 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2rdg\" (UniqueName: \"kubernetes.io/projected/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-kube-api-access-f2rdg\") on node \"crc\" DevicePath \"\"" Dec 09 05:56:16 crc kubenswrapper[4766]: I1209 05:56:16.948108 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aadd3b6-2c6f-414f-ab7f-bbd82c568689-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:56:17 crc kubenswrapper[4766]: I1209 05:56:17.699084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pt6rh" event={"ID":"8aadd3b6-2c6f-414f-ab7f-bbd82c568689","Type":"ContainerDied","Data":"d43a8e29a16cc718f389ee21f5d0068b5f6576e1112bf9c0d4fd480000dd2d4c"} Dec 09 05:56:17 crc kubenswrapper[4766]: I1209 05:56:17.699461 4766 scope.go:117] "RemoveContainer" containerID="5f99dbbc13ee0489976e6eee73f09e7f039e218f91aee1be5a7940a48887d3fa" Dec 09 05:56:17 crc kubenswrapper[4766]: I1209 05:56:17.699649 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pt6rh" Dec 09 05:56:17 crc kubenswrapper[4766]: I1209 05:56:17.734543 4766 scope.go:117] "RemoveContainer" containerID="8b65800fc7209b610b22493e9a516d034b0fbcb7eeb84e882ee346fc3a20ed51" Dec 09 05:56:17 crc kubenswrapper[4766]: I1209 05:56:17.744337 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pt6rh"] Dec 09 05:56:17 crc kubenswrapper[4766]: I1209 05:56:17.763431 4766 scope.go:117] "RemoveContainer" containerID="a3adaba59234c56d59bd1503ffaee5a99a9c6fbfd741007f9f95851f40862df5" Dec 09 05:56:17 crc kubenswrapper[4766]: I1209 05:56:17.773255 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pt6rh"] Dec 09 05:56:18 crc kubenswrapper[4766]: I1209 05:56:18.862766 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" path="/var/lib/kubelet/pods/8aadd3b6-2c6f-414f-ab7f-bbd82c568689/volumes" Dec 09 05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.316785 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.317319 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.317370 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 
05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.318264 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.318329 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" gracePeriod=600 Dec 09 05:56:37 crc kubenswrapper[4766]: E1209 05:56:37.454203 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.957673 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" exitCode=0 Dec 09 05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.957739 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86"} Dec 09 05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.957822 4766 scope.go:117] 
"RemoveContainer" containerID="a91ffe188e8db06586ea61e7c1b2afd36c11fb9b8087b7474593627431c68daf" Dec 09 05:56:37 crc kubenswrapper[4766]: I1209 05:56:37.958811 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:56:37 crc kubenswrapper[4766]: E1209 05:56:37.959246 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:56:48 crc kubenswrapper[4766]: I1209 05:56:48.855908 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:56:48 crc kubenswrapper[4766]: E1209 05:56:48.856710 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:57:01 crc kubenswrapper[4766]: I1209 05:57:01.839659 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:57:01 crc kubenswrapper[4766]: E1209 05:57:01.840486 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:57:13 crc kubenswrapper[4766]: I1209 05:57:13.839195 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:57:13 crc kubenswrapper[4766]: E1209 05:57:13.839957 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:57:24 crc kubenswrapper[4766]: I1209 05:57:24.839707 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:57:24 crc kubenswrapper[4766]: E1209 05:57:24.840927 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:57:37 crc kubenswrapper[4766]: I1209 05:57:37.839857 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:57:37 crc kubenswrapper[4766]: E1209 05:57:37.840775 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:57:49 crc kubenswrapper[4766]: I1209 05:57:49.840085 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:57:49 crc kubenswrapper[4766]: E1209 05:57:49.842332 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.576602 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p6w8g"] Dec 09 05:57:58 crc kubenswrapper[4766]: E1209 05:57:58.577691 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerName="registry-server" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.577707 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerName="registry-server" Dec 09 05:57:58 crc kubenswrapper[4766]: E1209 05:57:58.577749 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerName="extract-utilities" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.577758 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerName="extract-utilities" Dec 09 05:57:58 crc kubenswrapper[4766]: E1209 05:57:58.577778 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerName="extract-content" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.577789 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerName="extract-content" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.578093 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aadd3b6-2c6f-414f-ab7f-bbd82c568689" containerName="registry-server" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.583608 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.597086 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6w8g"] Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.686997 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwd2r\" (UniqueName: \"kubernetes.io/projected/ad29e02b-7d47-4705-995a-8cae653d5a09-kube-api-access-kwd2r\") pod \"redhat-marketplace-p6w8g\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.687674 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-catalog-content\") pod \"redhat-marketplace-p6w8g\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.687931 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-utilities\") pod \"redhat-marketplace-p6w8g\" (UID: 
\"ad29e02b-7d47-4705-995a-8cae653d5a09\") " pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.790321 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-utilities\") pod \"redhat-marketplace-p6w8g\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.790447 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwd2r\" (UniqueName: \"kubernetes.io/projected/ad29e02b-7d47-4705-995a-8cae653d5a09-kube-api-access-kwd2r\") pod \"redhat-marketplace-p6w8g\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.790594 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-catalog-content\") pod \"redhat-marketplace-p6w8g\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.790879 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-utilities\") pod \"redhat-marketplace-p6w8g\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.790983 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-catalog-content\") pod \"redhat-marketplace-p6w8g\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " 
pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.816736 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwd2r\" (UniqueName: \"kubernetes.io/projected/ad29e02b-7d47-4705-995a-8cae653d5a09-kube-api-access-kwd2r\") pod \"redhat-marketplace-p6w8g\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:58 crc kubenswrapper[4766]: I1209 05:57:58.909312 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:57:59 crc kubenswrapper[4766]: I1209 05:57:59.427490 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6w8g"] Dec 09 05:57:59 crc kubenswrapper[4766]: I1209 05:57:59.950734 4766 generic.go:334] "Generic (PLEG): container finished" podID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerID="58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76" exitCode=0 Dec 09 05:57:59 crc kubenswrapper[4766]: I1209 05:57:59.950840 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6w8g" event={"ID":"ad29e02b-7d47-4705-995a-8cae653d5a09","Type":"ContainerDied","Data":"58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76"} Dec 09 05:57:59 crc kubenswrapper[4766]: I1209 05:57:59.951251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6w8g" event={"ID":"ad29e02b-7d47-4705-995a-8cae653d5a09","Type":"ContainerStarted","Data":"c778789b79e00414835d4e2017845ba63f7327665c1d4cb1530936153b03b8e3"} Dec 09 05:57:59 crc kubenswrapper[4766]: I1209 05:57:59.953998 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 05:58:00 crc kubenswrapper[4766]: I1209 05:58:00.840910 4766 scope.go:117] "RemoveContainer" 
containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:58:00 crc kubenswrapper[4766]: E1209 05:58:00.841209 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:58:02 crc kubenswrapper[4766]: I1209 05:58:02.002684 4766 generic.go:334] "Generic (PLEG): container finished" podID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerID="e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df" exitCode=0 Dec 09 05:58:02 crc kubenswrapper[4766]: I1209 05:58:02.002840 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6w8g" event={"ID":"ad29e02b-7d47-4705-995a-8cae653d5a09","Type":"ContainerDied","Data":"e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df"} Dec 09 05:58:03 crc kubenswrapper[4766]: I1209 05:58:03.017176 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6w8g" event={"ID":"ad29e02b-7d47-4705-995a-8cae653d5a09","Type":"ContainerStarted","Data":"334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903"} Dec 09 05:58:03 crc kubenswrapper[4766]: I1209 05:58:03.044715 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p6w8g" podStartSLOduration=2.5565036230000002 podStartE2EDuration="5.044689961s" podCreationTimestamp="2025-12-09 05:57:58 +0000 UTC" firstStartedPulling="2025-12-09 05:57:59.953764636 +0000 UTC m=+9961.663070062" lastFinishedPulling="2025-12-09 05:58:02.441950974 +0000 UTC m=+9964.151256400" observedRunningTime="2025-12-09 05:58:03.038789952 
+0000 UTC m=+9964.748095398" watchObservedRunningTime="2025-12-09 05:58:03.044689961 +0000 UTC m=+9964.753995427" Dec 09 05:58:08 crc kubenswrapper[4766]: I1209 05:58:08.909712 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:58:08 crc kubenswrapper[4766]: I1209 05:58:08.910363 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:58:08 crc kubenswrapper[4766]: I1209 05:58:08.953464 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:58:09 crc kubenswrapper[4766]: I1209 05:58:09.120331 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:58:09 crc kubenswrapper[4766]: I1209 05:58:09.208542 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6w8g"] Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.092514 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p6w8g" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerName="registry-server" containerID="cri-o://334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903" gracePeriod=2 Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.679052 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.772362 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-utilities\") pod \"ad29e02b-7d47-4705-995a-8cae653d5a09\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.772494 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwd2r\" (UniqueName: \"kubernetes.io/projected/ad29e02b-7d47-4705-995a-8cae653d5a09-kube-api-access-kwd2r\") pod \"ad29e02b-7d47-4705-995a-8cae653d5a09\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.772628 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-catalog-content\") pod \"ad29e02b-7d47-4705-995a-8cae653d5a09\" (UID: \"ad29e02b-7d47-4705-995a-8cae653d5a09\") " Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.773961 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-utilities" (OuterVolumeSpecName: "utilities") pod "ad29e02b-7d47-4705-995a-8cae653d5a09" (UID: "ad29e02b-7d47-4705-995a-8cae653d5a09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.787657 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad29e02b-7d47-4705-995a-8cae653d5a09-kube-api-access-kwd2r" (OuterVolumeSpecName: "kube-api-access-kwd2r") pod "ad29e02b-7d47-4705-995a-8cae653d5a09" (UID: "ad29e02b-7d47-4705-995a-8cae653d5a09"). InnerVolumeSpecName "kube-api-access-kwd2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.793076 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad29e02b-7d47-4705-995a-8cae653d5a09" (UID: "ad29e02b-7d47-4705-995a-8cae653d5a09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.875240 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.875281 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwd2r\" (UniqueName: \"kubernetes.io/projected/ad29e02b-7d47-4705-995a-8cae653d5a09-kube-api-access-kwd2r\") on node \"crc\" DevicePath \"\"" Dec 09 05:58:11 crc kubenswrapper[4766]: I1209 05:58:11.875293 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad29e02b-7d47-4705-995a-8cae653d5a09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.108099 4766 generic.go:334] "Generic (PLEG): container finished" podID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerID="334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903" exitCode=0 Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.108141 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6w8g" event={"ID":"ad29e02b-7d47-4705-995a-8cae653d5a09","Type":"ContainerDied","Data":"334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903"} Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.108153 4766 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6w8g" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.108169 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6w8g" event={"ID":"ad29e02b-7d47-4705-995a-8cae653d5a09","Type":"ContainerDied","Data":"c778789b79e00414835d4e2017845ba63f7327665c1d4cb1530936153b03b8e3"} Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.108186 4766 scope.go:117] "RemoveContainer" containerID="334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.142023 4766 scope.go:117] "RemoveContainer" containerID="e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.142358 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6w8g"] Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.153564 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6w8g"] Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.163129 4766 scope.go:117] "RemoveContainer" containerID="58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.204899 4766 scope.go:117] "RemoveContainer" containerID="334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903" Dec 09 05:58:12 crc kubenswrapper[4766]: E1209 05:58:12.205535 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903\": container with ID starting with 334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903 not found: ID does not exist" containerID="334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.205586 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903"} err="failed to get container status \"334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903\": rpc error: code = NotFound desc = could not find container \"334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903\": container with ID starting with 334f79ba8fd2eb7cb45ae7d7bf0e2126ccb6b7674599250bf6ccf69172805903 not found: ID does not exist" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.205615 4766 scope.go:117] "RemoveContainer" containerID="e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df" Dec 09 05:58:12 crc kubenswrapper[4766]: E1209 05:58:12.205998 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df\": container with ID starting with e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df not found: ID does not exist" containerID="e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.206027 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df"} err="failed to get container status \"e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df\": rpc error: code = NotFound desc = could not find container \"e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df\": container with ID starting with e68fd3750cffc96a1c370a98842e8cec92a0703274f3bcca5b1290e974eff2df not found: ID does not exist" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.206048 4766 scope.go:117] "RemoveContainer" containerID="58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76" Dec 09 05:58:12 crc kubenswrapper[4766]: E1209 
05:58:12.206361 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76\": container with ID starting with 58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76 not found: ID does not exist" containerID="58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.206435 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76"} err="failed to get container status \"58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76\": rpc error: code = NotFound desc = could not find container \"58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76\": container with ID starting with 58ad72b201a23abe35c0985fc24767b60f416c03e7fdf0b251eb81a56dd90d76 not found: ID does not exist" Dec 09 05:58:12 crc kubenswrapper[4766]: I1209 05:58:12.856766 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" path="/var/lib/kubelet/pods/ad29e02b-7d47-4705-995a-8cae653d5a09/volumes" Dec 09 05:58:13 crc kubenswrapper[4766]: I1209 05:58:13.839029 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:58:13 crc kubenswrapper[4766]: E1209 05:58:13.839587 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.253361 
4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bz7gl"] Dec 09 05:58:18 crc kubenswrapper[4766]: E1209 05:58:18.254352 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerName="extract-content" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.254373 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerName="extract-content" Dec 09 05:58:18 crc kubenswrapper[4766]: E1209 05:58:18.254419 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerName="extract-utilities" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.254431 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerName="extract-utilities" Dec 09 05:58:18 crc kubenswrapper[4766]: E1209 05:58:18.254456 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerName="registry-server" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.254464 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerName="registry-server" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.254777 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad29e02b-7d47-4705-995a-8cae653d5a09" containerName="registry-server" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.256929 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.283355 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bz7gl"] Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.428076 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-utilities\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.428137 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-catalog-content\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.428303 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzk5m\" (UniqueName: \"kubernetes.io/projected/de27651f-3a83-41e9-ba5a-6bccbdd6e966-kube-api-access-wzk5m\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.530355 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-utilities\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.530411 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-catalog-content\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.530522 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzk5m\" (UniqueName: \"kubernetes.io/projected/de27651f-3a83-41e9-ba5a-6bccbdd6e966-kube-api-access-wzk5m\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.530834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-utilities\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.530906 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-catalog-content\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.759242 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzk5m\" (UniqueName: \"kubernetes.io/projected/de27651f-3a83-41e9-ba5a-6bccbdd6e966-kube-api-access-wzk5m\") pod \"redhat-operators-bz7gl\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:18 crc kubenswrapper[4766]: I1209 05:58:18.875725 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:19 crc kubenswrapper[4766]: I1209 05:58:19.377603 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bz7gl"] Dec 09 05:58:20 crc kubenswrapper[4766]: I1209 05:58:20.201745 4766 generic.go:334] "Generic (PLEG): container finished" podID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerID="a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe" exitCode=0 Dec 09 05:58:20 crc kubenswrapper[4766]: I1209 05:58:20.201862 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bz7gl" event={"ID":"de27651f-3a83-41e9-ba5a-6bccbdd6e966","Type":"ContainerDied","Data":"a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe"} Dec 09 05:58:20 crc kubenswrapper[4766]: I1209 05:58:20.202120 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bz7gl" event={"ID":"de27651f-3a83-41e9-ba5a-6bccbdd6e966","Type":"ContainerStarted","Data":"650a1dea4fad810082e0cf9f35a04856a23412da63ce907522bdfe5201fd167b"} Dec 09 05:58:22 crc kubenswrapper[4766]: I1209 05:58:22.228378 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bz7gl" event={"ID":"de27651f-3a83-41e9-ba5a-6bccbdd6e966","Type":"ContainerStarted","Data":"ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7"} Dec 09 05:58:24 crc kubenswrapper[4766]: I1209 05:58:24.251572 4766 generic.go:334] "Generic (PLEG): container finished" podID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerID="ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7" exitCode=0 Dec 09 05:58:24 crc kubenswrapper[4766]: I1209 05:58:24.251648 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bz7gl" 
event={"ID":"de27651f-3a83-41e9-ba5a-6bccbdd6e966","Type":"ContainerDied","Data":"ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7"} Dec 09 05:58:25 crc kubenswrapper[4766]: I1209 05:58:25.264866 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bz7gl" event={"ID":"de27651f-3a83-41e9-ba5a-6bccbdd6e966","Type":"ContainerStarted","Data":"ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237"} Dec 09 05:58:25 crc kubenswrapper[4766]: I1209 05:58:25.293027 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bz7gl" podStartSLOduration=2.8262860290000003 podStartE2EDuration="7.293004227s" podCreationTimestamp="2025-12-09 05:58:18 +0000 UTC" firstStartedPulling="2025-12-09 05:58:20.204330972 +0000 UTC m=+9981.913636408" lastFinishedPulling="2025-12-09 05:58:24.67104918 +0000 UTC m=+9986.380354606" observedRunningTime="2025-12-09 05:58:25.286902712 +0000 UTC m=+9986.996208148" watchObservedRunningTime="2025-12-09 05:58:25.293004227 +0000 UTC m=+9987.002309653" Dec 09 05:58:26 crc kubenswrapper[4766]: I1209 05:58:26.839666 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:58:26 crc kubenswrapper[4766]: E1209 05:58:26.840110 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:58:28 crc kubenswrapper[4766]: I1209 05:58:28.876120 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:28 crc 
kubenswrapper[4766]: I1209 05:58:28.876393 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:29 crc kubenswrapper[4766]: I1209 05:58:29.003481 4766 scope.go:117] "RemoveContainer" containerID="1b3a0f2428ca875a234b418c6d8a7724d25e5dba181181560930914d53818415" Dec 09 05:58:29 crc kubenswrapper[4766]: I1209 05:58:29.976922 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bz7gl" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="registry-server" probeResult="failure" output=< Dec 09 05:58:29 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 05:58:29 crc kubenswrapper[4766]: > Dec 09 05:58:38 crc kubenswrapper[4766]: I1209 05:58:38.848946 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:58:38 crc kubenswrapper[4766]: E1209 05:58:38.849852 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:58:38 crc kubenswrapper[4766]: I1209 05:58:38.926382 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:38 crc kubenswrapper[4766]: I1209 05:58:38.974559 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:39 crc kubenswrapper[4766]: I1209 05:58:39.168074 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bz7gl"] Dec 09 05:58:40 crc 
kubenswrapper[4766]: I1209 05:58:40.445645 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bz7gl" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="registry-server" containerID="cri-o://ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237" gracePeriod=2 Dec 09 05:58:40 crc kubenswrapper[4766]: I1209 05:58:40.983080 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.120765 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-catalog-content\") pod \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.121104 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-utilities\") pod \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.121151 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzk5m\" (UniqueName: \"kubernetes.io/projected/de27651f-3a83-41e9-ba5a-6bccbdd6e966-kube-api-access-wzk5m\") pod \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\" (UID: \"de27651f-3a83-41e9-ba5a-6bccbdd6e966\") " Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.122029 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-utilities" (OuterVolumeSpecName: "utilities") pod "de27651f-3a83-41e9-ba5a-6bccbdd6e966" (UID: "de27651f-3a83-41e9-ba5a-6bccbdd6e966"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.126823 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de27651f-3a83-41e9-ba5a-6bccbdd6e966-kube-api-access-wzk5m" (OuterVolumeSpecName: "kube-api-access-wzk5m") pod "de27651f-3a83-41e9-ba5a-6bccbdd6e966" (UID: "de27651f-3a83-41e9-ba5a-6bccbdd6e966"). InnerVolumeSpecName "kube-api-access-wzk5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.223946 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.223983 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzk5m\" (UniqueName: \"kubernetes.io/projected/de27651f-3a83-41e9-ba5a-6bccbdd6e966-kube-api-access-wzk5m\") on node \"crc\" DevicePath \"\"" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.225438 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de27651f-3a83-41e9-ba5a-6bccbdd6e966" (UID: "de27651f-3a83-41e9-ba5a-6bccbdd6e966"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.326610 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de27651f-3a83-41e9-ba5a-6bccbdd6e966-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.457420 4766 generic.go:334] "Generic (PLEG): container finished" podID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerID="ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237" exitCode=0 Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.457460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bz7gl" event={"ID":"de27651f-3a83-41e9-ba5a-6bccbdd6e966","Type":"ContainerDied","Data":"ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237"} Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.457495 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bz7gl" event={"ID":"de27651f-3a83-41e9-ba5a-6bccbdd6e966","Type":"ContainerDied","Data":"650a1dea4fad810082e0cf9f35a04856a23412da63ce907522bdfe5201fd167b"} Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.457511 4766 scope.go:117] "RemoveContainer" containerID="ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.457561 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bz7gl" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.493100 4766 scope.go:117] "RemoveContainer" containerID="ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.517143 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bz7gl"] Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.522737 4766 scope.go:117] "RemoveContainer" containerID="a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.526564 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bz7gl"] Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.574113 4766 scope.go:117] "RemoveContainer" containerID="ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237" Dec 09 05:58:41 crc kubenswrapper[4766]: E1209 05:58:41.574649 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237\": container with ID starting with ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237 not found: ID does not exist" containerID="ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.574730 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237"} err="failed to get container status \"ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237\": rpc error: code = NotFound desc = could not find container \"ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237\": container with ID starting with ce7c488a44afbbda6dbb63a992da52bf9ca19563cbf72150e551079c7c46f237 not found: ID does 
not exist" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.574779 4766 scope.go:117] "RemoveContainer" containerID="ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7" Dec 09 05:58:41 crc kubenswrapper[4766]: E1209 05:58:41.575337 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7\": container with ID starting with ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7 not found: ID does not exist" containerID="ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.575377 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7"} err="failed to get container status \"ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7\": rpc error: code = NotFound desc = could not find container \"ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7\": container with ID starting with ce3c82c49f257657621d962bb6e386c37954b765265079d7c617432b96b155e7 not found: ID does not exist" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.575417 4766 scope.go:117] "RemoveContainer" containerID="a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe" Dec 09 05:58:41 crc kubenswrapper[4766]: E1209 05:58:41.575723 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe\": container with ID starting with a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe not found: ID does not exist" containerID="a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe" Dec 09 05:58:41 crc kubenswrapper[4766]: I1209 05:58:41.575781 4766 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe"} err="failed to get container status \"a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe\": rpc error: code = NotFound desc = could not find container \"a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe\": container with ID starting with a6fa6df90d3fcc6372ea49bb3e23a72acc06b73f87fbee76eadafd8923d364fe not found: ID does not exist" Dec 09 05:58:42 crc kubenswrapper[4766]: I1209 05:58:42.855746 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" path="/var/lib/kubelet/pods/de27651f-3a83-41e9-ba5a-6bccbdd6e966/volumes" Dec 09 05:58:51 crc kubenswrapper[4766]: I1209 05:58:51.839377 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:58:51 crc kubenswrapper[4766]: E1209 05:58:51.840357 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:59:02 crc kubenswrapper[4766]: I1209 05:59:02.839734 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:59:02 crc kubenswrapper[4766]: E1209 05:59:02.840934 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.186610 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jjj7p"] Dec 09 05:59:03 crc kubenswrapper[4766]: E1209 05:59:03.187144 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="registry-server" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.187172 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="registry-server" Dec 09 05:59:03 crc kubenswrapper[4766]: E1209 05:59:03.187205 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="extract-content" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.187480 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="extract-content" Dec 09 05:59:03 crc kubenswrapper[4766]: E1209 05:59:03.187528 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="extract-utilities" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.187538 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="extract-utilities" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.187795 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="de27651f-3a83-41e9-ba5a-6bccbdd6e966" containerName="registry-server" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.190462 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.202678 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjj7p"] Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.337990 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztfqb\" (UniqueName: \"kubernetes.io/projected/ecad06b4-2c53-4419-874c-9d87c1b5680c-kube-api-access-ztfqb\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.338164 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-utilities\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.338404 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-catalog-content\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.440810 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztfqb\" (UniqueName: \"kubernetes.io/projected/ecad06b4-2c53-4419-874c-9d87c1b5680c-kube-api-access-ztfqb\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.440908 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-utilities\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.440972 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-catalog-content\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.441435 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-catalog-content\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.441498 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-utilities\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.462714 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztfqb\" (UniqueName: \"kubernetes.io/projected/ecad06b4-2c53-4419-874c-9d87c1b5680c-kube-api-access-ztfqb\") pod \"certified-operators-jjj7p\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:03 crc kubenswrapper[4766]: I1209 05:59:03.522906 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:04 crc kubenswrapper[4766]: I1209 05:59:04.024041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjj7p"] Dec 09 05:59:04 crc kubenswrapper[4766]: I1209 05:59:04.717862 4766 generic.go:334] "Generic (PLEG): container finished" podID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerID="2ee9683bd1859985106ce158b7a09d7e2c57d6548218b980c3b9ec1da3fdc6ea" exitCode=0 Dec 09 05:59:04 crc kubenswrapper[4766]: I1209 05:59:04.717947 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjj7p" event={"ID":"ecad06b4-2c53-4419-874c-9d87c1b5680c","Type":"ContainerDied","Data":"2ee9683bd1859985106ce158b7a09d7e2c57d6548218b980c3b9ec1da3fdc6ea"} Dec 09 05:59:04 crc kubenswrapper[4766]: I1209 05:59:04.718279 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjj7p" event={"ID":"ecad06b4-2c53-4419-874c-9d87c1b5680c","Type":"ContainerStarted","Data":"e9a44a1188a99a15114ad3ad1030516c1089159b3cb869d557fed71eedc143f0"} Dec 09 05:59:05 crc kubenswrapper[4766]: I1209 05:59:05.730785 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjj7p" event={"ID":"ecad06b4-2c53-4419-874c-9d87c1b5680c","Type":"ContainerStarted","Data":"2477cd55fc1efa355d74074907d34a3ef8168b8438ef390100c34e42d72587d8"} Dec 09 05:59:06 crc kubenswrapper[4766]: I1209 05:59:06.745760 4766 generic.go:334] "Generic (PLEG): container finished" podID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerID="2477cd55fc1efa355d74074907d34a3ef8168b8438ef390100c34e42d72587d8" exitCode=0 Dec 09 05:59:06 crc kubenswrapper[4766]: I1209 05:59:06.745835 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjj7p" 
event={"ID":"ecad06b4-2c53-4419-874c-9d87c1b5680c","Type":"ContainerDied","Data":"2477cd55fc1efa355d74074907d34a3ef8168b8438ef390100c34e42d72587d8"} Dec 09 05:59:07 crc kubenswrapper[4766]: I1209 05:59:07.759133 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjj7p" event={"ID":"ecad06b4-2c53-4419-874c-9d87c1b5680c","Type":"ContainerStarted","Data":"f9902c7f29a6fdb114022ccdcbd4afdce7c61e7d024eb86edc36e31d3343ff48"} Dec 09 05:59:07 crc kubenswrapper[4766]: I1209 05:59:07.799201 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jjj7p" podStartSLOduration=2.361039402 podStartE2EDuration="4.799181097s" podCreationTimestamp="2025-12-09 05:59:03 +0000 UTC" firstStartedPulling="2025-12-09 05:59:04.720541643 +0000 UTC m=+10026.429847079" lastFinishedPulling="2025-12-09 05:59:07.158683348 +0000 UTC m=+10028.867988774" observedRunningTime="2025-12-09 05:59:07.78897081 +0000 UTC m=+10029.498276246" watchObservedRunningTime="2025-12-09 05:59:07.799181097 +0000 UTC m=+10029.508486523" Dec 09 05:59:13 crc kubenswrapper[4766]: I1209 05:59:13.523366 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:13 crc kubenswrapper[4766]: I1209 05:59:13.524052 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:13 crc kubenswrapper[4766]: I1209 05:59:13.572328 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:13 crc kubenswrapper[4766]: I1209 05:59:13.839830 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:59:13 crc kubenswrapper[4766]: E1209 05:59:13.840125 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:59:13 crc kubenswrapper[4766]: I1209 05:59:13.867345 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:13 crc kubenswrapper[4766]: I1209 05:59:13.973609 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjj7p"] Dec 09 05:59:15 crc kubenswrapper[4766]: I1209 05:59:15.842321 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jjj7p" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerName="registry-server" containerID="cri-o://f9902c7f29a6fdb114022ccdcbd4afdce7c61e7d024eb86edc36e31d3343ff48" gracePeriod=2 Dec 09 05:59:16 crc kubenswrapper[4766]: I1209 05:59:16.886223 4766 generic.go:334] "Generic (PLEG): container finished" podID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerID="f9902c7f29a6fdb114022ccdcbd4afdce7c61e7d024eb86edc36e31d3343ff48" exitCode=0 Dec 09 05:59:16 crc kubenswrapper[4766]: I1209 05:59:16.886436 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjj7p" event={"ID":"ecad06b4-2c53-4419-874c-9d87c1b5680c","Type":"ContainerDied","Data":"f9902c7f29a6fdb114022ccdcbd4afdce7c61e7d024eb86edc36e31d3343ff48"} Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.502792 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.669773 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztfqb\" (UniqueName: \"kubernetes.io/projected/ecad06b4-2c53-4419-874c-9d87c1b5680c-kube-api-access-ztfqb\") pod \"ecad06b4-2c53-4419-874c-9d87c1b5680c\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.670006 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-catalog-content\") pod \"ecad06b4-2c53-4419-874c-9d87c1b5680c\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.670086 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-utilities\") pod \"ecad06b4-2c53-4419-874c-9d87c1b5680c\" (UID: \"ecad06b4-2c53-4419-874c-9d87c1b5680c\") " Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.671379 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-utilities" (OuterVolumeSpecName: "utilities") pod "ecad06b4-2c53-4419-874c-9d87c1b5680c" (UID: "ecad06b4-2c53-4419-874c-9d87c1b5680c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.725948 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecad06b4-2c53-4419-874c-9d87c1b5680c" (UID: "ecad06b4-2c53-4419-874c-9d87c1b5680c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.774130 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.774695 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecad06b4-2c53-4419-874c-9d87c1b5680c-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.899693 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjj7p" event={"ID":"ecad06b4-2c53-4419-874c-9d87c1b5680c","Type":"ContainerDied","Data":"e9a44a1188a99a15114ad3ad1030516c1089159b3cb869d557fed71eedc143f0"} Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.901029 4766 scope.go:117] "RemoveContainer" containerID="f9902c7f29a6fdb114022ccdcbd4afdce7c61e7d024eb86edc36e31d3343ff48" Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.900045 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjj7p" Dec 09 05:59:17 crc kubenswrapper[4766]: I1209 05:59:17.933926 4766 scope.go:117] "RemoveContainer" containerID="2477cd55fc1efa355d74074907d34a3ef8168b8438ef390100c34e42d72587d8" Dec 09 05:59:18 crc kubenswrapper[4766]: I1209 05:59:18.161417 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecad06b4-2c53-4419-874c-9d87c1b5680c-kube-api-access-ztfqb" (OuterVolumeSpecName: "kube-api-access-ztfqb") pod "ecad06b4-2c53-4419-874c-9d87c1b5680c" (UID: "ecad06b4-2c53-4419-874c-9d87c1b5680c"). InnerVolumeSpecName "kube-api-access-ztfqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 05:59:18 crc kubenswrapper[4766]: I1209 05:59:18.182795 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztfqb\" (UniqueName: \"kubernetes.io/projected/ecad06b4-2c53-4419-874c-9d87c1b5680c-kube-api-access-ztfqb\") on node \"crc\" DevicePath \"\"" Dec 09 05:59:18 crc kubenswrapper[4766]: I1209 05:59:18.216629 4766 scope.go:117] "RemoveContainer" containerID="2ee9683bd1859985106ce158b7a09d7e2c57d6548218b980c3b9ec1da3fdc6ea" Dec 09 05:59:18 crc kubenswrapper[4766]: I1209 05:59:18.340664 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjj7p"] Dec 09 05:59:18 crc kubenswrapper[4766]: I1209 05:59:18.351474 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jjj7p"] Dec 09 05:59:18 crc kubenswrapper[4766]: I1209 05:59:18.852594 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" path="/var/lib/kubelet/pods/ecad06b4-2c53-4419-874c-9d87c1b5680c/volumes" Dec 09 05:59:25 crc kubenswrapper[4766]: I1209 05:59:25.839277 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:59:25 crc kubenswrapper[4766]: E1209 05:59:25.840141 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:59:37 crc kubenswrapper[4766]: I1209 05:59:37.839607 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:59:37 crc 
kubenswrapper[4766]: E1209 05:59:37.840654 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 05:59:49 crc kubenswrapper[4766]: I1209 05:59:49.839613 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 05:59:49 crc kubenswrapper[4766]: E1209 05:59:49.840511 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.165564 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd"] Dec 09 06:00:00 crc kubenswrapper[4766]: E1209 06:00:00.167971 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerName="registry-server" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.168077 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerName="registry-server" Dec 09 06:00:00 crc kubenswrapper[4766]: E1209 06:00:00.168170 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerName="extract-content" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 
06:00:00.168257 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerName="extract-content" Dec 09 06:00:00 crc kubenswrapper[4766]: E1209 06:00:00.168328 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerName="extract-utilities" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.168406 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerName="extract-utilities" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.168702 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecad06b4-2c53-4419-874c-9d87c1b5680c" containerName="registry-server" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.169662 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.171737 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.172722 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.176669 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd"] Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.324435 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f448ba7-d8ca-4714-8e52-7b87f5fda559-config-volume\") pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" 
Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.324818 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqgkj\" (UniqueName: \"kubernetes.io/projected/6f448ba7-d8ca-4714-8e52-7b87f5fda559-kube-api-access-rqgkj\") pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.324932 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f448ba7-d8ca-4714-8e52-7b87f5fda559-secret-volume\") pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.427265 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f448ba7-d8ca-4714-8e52-7b87f5fda559-secret-volume\") pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.427504 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f448ba7-d8ca-4714-8e52-7b87f5fda559-config-volume\") pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.427532 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqgkj\" (UniqueName: \"kubernetes.io/projected/6f448ba7-d8ca-4714-8e52-7b87f5fda559-kube-api-access-rqgkj\") 
pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.428361 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f448ba7-d8ca-4714-8e52-7b87f5fda559-config-volume\") pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.438015 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f448ba7-d8ca-4714-8e52-7b87f5fda559-secret-volume\") pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.449369 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqgkj\" (UniqueName: \"kubernetes.io/projected/6f448ba7-d8ca-4714-8e52-7b87f5fda559-kube-api-access-rqgkj\") pod \"collect-profiles-29421000-g6fbd\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.500985 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:00 crc kubenswrapper[4766]: I1209 06:00:00.990129 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd"] Dec 09 06:00:01 crc kubenswrapper[4766]: I1209 06:00:01.368948 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" event={"ID":"6f448ba7-d8ca-4714-8e52-7b87f5fda559","Type":"ContainerStarted","Data":"27dbffc27841e7aa42d094d7d21094b63231fcdc05566d57db364a5234cd312e"} Dec 09 06:00:01 crc kubenswrapper[4766]: I1209 06:00:01.369245 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" event={"ID":"6f448ba7-d8ca-4714-8e52-7b87f5fda559","Type":"ContainerStarted","Data":"30a9e785d459b64caa75ee45465cfd6fad69932394a1b44ada941edc314de18a"} Dec 09 06:00:02 crc kubenswrapper[4766]: I1209 06:00:02.383884 4766 generic.go:334] "Generic (PLEG): container finished" podID="6f448ba7-d8ca-4714-8e52-7b87f5fda559" containerID="27dbffc27841e7aa42d094d7d21094b63231fcdc05566d57db364a5234cd312e" exitCode=0 Dec 09 06:00:02 crc kubenswrapper[4766]: I1209 06:00:02.383968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" event={"ID":"6f448ba7-d8ca-4714-8e52-7b87f5fda559","Type":"ContainerDied","Data":"27dbffc27841e7aa42d094d7d21094b63231fcdc05566d57db364a5234cd312e"} Dec 09 06:00:02 crc kubenswrapper[4766]: I1209 06:00:02.839742 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:00:02 crc kubenswrapper[4766]: E1209 06:00:02.840180 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.734292 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.818770 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f448ba7-d8ca-4714-8e52-7b87f5fda559-config-volume\") pod \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.818847 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f448ba7-d8ca-4714-8e52-7b87f5fda559-secret-volume\") pod \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.819043 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqgkj\" (UniqueName: \"kubernetes.io/projected/6f448ba7-d8ca-4714-8e52-7b87f5fda559-kube-api-access-rqgkj\") pod \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\" (UID: \"6f448ba7-d8ca-4714-8e52-7b87f5fda559\") " Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.819898 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f448ba7-d8ca-4714-8e52-7b87f5fda559-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f448ba7-d8ca-4714-8e52-7b87f5fda559" (UID: "6f448ba7-d8ca-4714-8e52-7b87f5fda559"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.824458 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f448ba7-d8ca-4714-8e52-7b87f5fda559-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f448ba7-d8ca-4714-8e52-7b87f5fda559" (UID: "6f448ba7-d8ca-4714-8e52-7b87f5fda559"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.824486 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f448ba7-d8ca-4714-8e52-7b87f5fda559-kube-api-access-rqgkj" (OuterVolumeSpecName: "kube-api-access-rqgkj") pod "6f448ba7-d8ca-4714-8e52-7b87f5fda559" (UID: "6f448ba7-d8ca-4714-8e52-7b87f5fda559"). InnerVolumeSpecName "kube-api-access-rqgkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.921875 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f448ba7-d8ca-4714-8e52-7b87f5fda559-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.921936 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f448ba7-d8ca-4714-8e52-7b87f5fda559-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 06:00:03 crc kubenswrapper[4766]: I1209 06:00:03.921955 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqgkj\" (UniqueName: \"kubernetes.io/projected/6f448ba7-d8ca-4714-8e52-7b87f5fda559-kube-api-access-rqgkj\") on node \"crc\" DevicePath \"\"" Dec 09 06:00:04 crc kubenswrapper[4766]: I1209 06:00:04.404860 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" 
event={"ID":"6f448ba7-d8ca-4714-8e52-7b87f5fda559","Type":"ContainerDied","Data":"30a9e785d459b64caa75ee45465cfd6fad69932394a1b44ada941edc314de18a"} Dec 09 06:00:04 crc kubenswrapper[4766]: I1209 06:00:04.405400 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30a9e785d459b64caa75ee45465cfd6fad69932394a1b44ada941edc314de18a" Dec 09 06:00:04 crc kubenswrapper[4766]: I1209 06:00:04.404897 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421000-g6fbd" Dec 09 06:00:04 crc kubenswrapper[4766]: I1209 06:00:04.492501 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj"] Dec 09 06:00:04 crc kubenswrapper[4766]: I1209 06:00:04.506445 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420955-7cghj"] Dec 09 06:00:04 crc kubenswrapper[4766]: I1209 06:00:04.851283 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2" path="/var/lib/kubelet/pods/011c26e0-2d8f-4f36-a2be-cc81ec3ce2c2/volumes" Dec 09 06:00:13 crc kubenswrapper[4766]: I1209 06:00:13.839971 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:00:13 crc kubenswrapper[4766]: E1209 06:00:13.840859 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:00:26 crc kubenswrapper[4766]: I1209 06:00:26.839576 4766 scope.go:117] "RemoveContainer" 
containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:00:26 crc kubenswrapper[4766]: E1209 06:00:26.840250 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:00:29 crc kubenswrapper[4766]: I1209 06:00:29.137644 4766 scope.go:117] "RemoveContainer" containerID="0cf9e10bf47c7edcf552b6625f28e358f12bc7ca0859a85457e77de8d63b146a" Dec 09 06:00:38 crc kubenswrapper[4766]: I1209 06:00:38.847491 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:00:38 crc kubenswrapper[4766]: E1209 06:00:38.848875 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:00:50 crc kubenswrapper[4766]: I1209 06:00:50.839682 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:00:50 crc kubenswrapper[4766]: E1209 06:00:50.840599 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.165749 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29421001-5q7md"] Dec 09 06:01:00 crc kubenswrapper[4766]: E1209 06:01:00.166667 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f448ba7-d8ca-4714-8e52-7b87f5fda559" containerName="collect-profiles" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.166685 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f448ba7-d8ca-4714-8e52-7b87f5fda559" containerName="collect-profiles" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.167015 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f448ba7-d8ca-4714-8e52-7b87f5fda559" containerName="collect-profiles" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.167813 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.176426 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29421001-5q7md"] Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.216896 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-combined-ca-bundle\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.217182 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpzjw\" (UniqueName: \"kubernetes.io/projected/f6e655b6-f2b9-4866-9ff9-98f2d500b061-kube-api-access-mpzjw\") pod \"keystone-cron-29421001-5q7md\" (UID: 
\"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.217432 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-fernet-keys\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.217584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-config-data\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.320854 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-fernet-keys\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.321087 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-config-data\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.321206 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-combined-ca-bundle\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " 
pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.321335 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpzjw\" (UniqueName: \"kubernetes.io/projected/f6e655b6-f2b9-4866-9ff9-98f2d500b061-kube-api-access-mpzjw\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.327584 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-combined-ca-bundle\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.328622 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-fernet-keys\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.332292 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-config-data\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.337590 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpzjw\" (UniqueName: \"kubernetes.io/projected/f6e655b6-f2b9-4866-9ff9-98f2d500b061-kube-api-access-mpzjw\") pod \"keystone-cron-29421001-5q7md\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc 
kubenswrapper[4766]: I1209 06:01:00.491953 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:00 crc kubenswrapper[4766]: I1209 06:01:00.989454 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29421001-5q7md"] Dec 09 06:01:01 crc kubenswrapper[4766]: I1209 06:01:01.041501 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29421001-5q7md" event={"ID":"f6e655b6-f2b9-4866-9ff9-98f2d500b061","Type":"ContainerStarted","Data":"2607d3ff4b20725b8a58f3899906613d21d904f7ce546a769fe8821881088bdf"} Dec 09 06:01:02 crc kubenswrapper[4766]: I1209 06:01:02.055365 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29421001-5q7md" event={"ID":"f6e655b6-f2b9-4866-9ff9-98f2d500b061","Type":"ContainerStarted","Data":"618d7f72c8f451826ca415f0e764ce7932bceb338ee2c3931a58b6be2973ffec"} Dec 09 06:01:02 crc kubenswrapper[4766]: I1209 06:01:02.088466 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29421001-5q7md" podStartSLOduration=2.088446322 podStartE2EDuration="2.088446322s" podCreationTimestamp="2025-12-09 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 06:01:02.08024498 +0000 UTC m=+10143.789550406" watchObservedRunningTime="2025-12-09 06:01:02.088446322 +0000 UTC m=+10143.797751748" Dec 09 06:01:04 crc kubenswrapper[4766]: I1209 06:01:04.081780 4766 generic.go:334] "Generic (PLEG): container finished" podID="f6e655b6-f2b9-4866-9ff9-98f2d500b061" containerID="618d7f72c8f451826ca415f0e764ce7932bceb338ee2c3931a58b6be2973ffec" exitCode=0 Dec 09 06:01:04 crc kubenswrapper[4766]: I1209 06:01:04.081924 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29421001-5q7md" 
event={"ID":"f6e655b6-f2b9-4866-9ff9-98f2d500b061","Type":"ContainerDied","Data":"618d7f72c8f451826ca415f0e764ce7932bceb338ee2c3931a58b6be2973ffec"} Dec 09 06:01:04 crc kubenswrapper[4766]: I1209 06:01:04.840148 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:01:04 crc kubenswrapper[4766]: E1209 06:01:04.840495 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:01:05 crc kubenswrapper[4766]: I1209 06:01:05.762477 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:05 crc kubenswrapper[4766]: I1209 06:01:05.850186 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-config-data\") pod \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " Dec 09 06:01:05 crc kubenswrapper[4766]: I1209 06:01:05.850473 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-combined-ca-bundle\") pod \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " Dec 09 06:01:05 crc kubenswrapper[4766]: I1209 06:01:05.850545 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpzjw\" (UniqueName: \"kubernetes.io/projected/f6e655b6-f2b9-4866-9ff9-98f2d500b061-kube-api-access-mpzjw\") pod 
\"f6e655b6-f2b9-4866-9ff9-98f2d500b061\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " Dec 09 06:01:05 crc kubenswrapper[4766]: I1209 06:01:05.850662 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-fernet-keys\") pod \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\" (UID: \"f6e655b6-f2b9-4866-9ff9-98f2d500b061\") " Dec 09 06:01:05 crc kubenswrapper[4766]: I1209 06:01:05.860865 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f6e655b6-f2b9-4866-9ff9-98f2d500b061" (UID: "f6e655b6-f2b9-4866-9ff9-98f2d500b061"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 06:01:05 crc kubenswrapper[4766]: I1209 06:01:05.954307 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.056517 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e655b6-f2b9-4866-9ff9-98f2d500b061-kube-api-access-mpzjw" (OuterVolumeSpecName: "kube-api-access-mpzjw") pod "f6e655b6-f2b9-4866-9ff9-98f2d500b061" (UID: "f6e655b6-f2b9-4866-9ff9-98f2d500b061"). InnerVolumeSpecName "kube-api-access-mpzjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.059435 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6e655b6-f2b9-4866-9ff9-98f2d500b061" (UID: "f6e655b6-f2b9-4866-9ff9-98f2d500b061"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.083655 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-config-data" (OuterVolumeSpecName: "config-data") pod "f6e655b6-f2b9-4866-9ff9-98f2d500b061" (UID: "f6e655b6-f2b9-4866-9ff9-98f2d500b061"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.107010 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29421001-5q7md" event={"ID":"f6e655b6-f2b9-4866-9ff9-98f2d500b061","Type":"ContainerDied","Data":"2607d3ff4b20725b8a58f3899906613d21d904f7ce546a769fe8821881088bdf"} Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.107057 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2607d3ff4b20725b8a58f3899906613d21d904f7ce546a769fe8821881088bdf" Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.107065 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29421001-5q7md" Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.160518 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-config-data\") on node \"crc\" DevicePath \"\"" Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.160558 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6e655b6-f2b9-4866-9ff9-98f2d500b061-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 06:01:06 crc kubenswrapper[4766]: I1209 06:01:06.160572 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpzjw\" (UniqueName: \"kubernetes.io/projected/f6e655b6-f2b9-4866-9ff9-98f2d500b061-kube-api-access-mpzjw\") on node \"crc\" DevicePath \"\"" Dec 09 06:01:15 crc kubenswrapper[4766]: I1209 06:01:15.839582 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:01:15 crc kubenswrapper[4766]: E1209 06:01:15.840626 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:01:28 crc kubenswrapper[4766]: I1209 06:01:28.854945 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:01:28 crc kubenswrapper[4766]: E1209 06:01:28.855991 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:01:41 crc kubenswrapper[4766]: I1209 06:01:41.840137 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:01:42 crc kubenswrapper[4766]: I1209 06:01:42.516811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"4797219e0810bddab585efd706eef9cd1de0b04bb6ec690af775b468156933a0"} Dec 09 06:02:21 crc kubenswrapper[4766]: I1209 06:02:21.529499 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_e98929a1-109f-4cbc-91a8-929a852d163b/init-config-reloader/0.log" Dec 09 06:02:21 crc kubenswrapper[4766]: I1209 06:02:21.845613 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_e98929a1-109f-4cbc-91a8-929a852d163b/alertmanager/0.log" Dec 09 06:02:21 crc kubenswrapper[4766]: I1209 06:02:21.857761 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_e98929a1-109f-4cbc-91a8-929a852d163b/init-config-reloader/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.283066 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_280582e1-1357-4079-bf39-4328d162cbe8/aodh-evaluator/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.285064 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_e98929a1-109f-4cbc-91a8-929a852d163b/config-reloader/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.292519 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-0_280582e1-1357-4079-bf39-4328d162cbe8/aodh-api/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.497646 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_280582e1-1357-4079-bf39-4328d162cbe8/aodh-listener/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.544319 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_280582e1-1357-4079-bf39-4328d162cbe8/aodh-notifier/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.569398 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7457696574-f27s8_2538afb4-ca93-4611-aa92-034f134c476d/barbican-api/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.720082 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7457696574-f27s8_2538afb4-ca93-4611-aa92-034f134c476d/barbican-api-log/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.806105 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-85b957594-kjrbm_f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe/barbican-keystone-listener-log/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.806802 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-85b957594-kjrbm_f6a8b1e1-4fe0-426a-acf7-9d74a1271dbe/barbican-keystone-listener/0.log" Dec 09 06:02:22 crc kubenswrapper[4766]: I1209 06:02:22.967317 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54f9c8697f-tdwlg_cece7620-62bd-44f9-8fad-c527e39ab2ee/barbican-worker/0.log" Dec 09 06:02:23 crc kubenswrapper[4766]: I1209 06:02:23.030977 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54f9c8697f-tdwlg_cece7620-62bd-44f9-8fad-c527e39ab2ee/barbican-worker-log/0.log" Dec 09 06:02:23 crc kubenswrapper[4766]: I1209 06:02:23.237232 4766 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-7nrh9_ac75c2a4-56e0-482c-94f6-919a66db4f5e/bootstrap-openstack-openstack-cell1/0.log" Dec 09 06:02:23 crc kubenswrapper[4766]: I1209 06:02:23.278493 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86691648-ed71-473b-9f1a-51b0982e140a/ceilometer-central-agent/0.log" Dec 09 06:02:23 crc kubenswrapper[4766]: I1209 06:02:23.411490 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86691648-ed71-473b-9f1a-51b0982e140a/ceilometer-notification-agent/0.log" Dec 09 06:02:23 crc kubenswrapper[4766]: I1209 06:02:23.444657 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86691648-ed71-473b-9f1a-51b0982e140a/sg-core/0.log" Dec 09 06:02:23 crc kubenswrapper[4766]: I1209 06:02:23.477751 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86691648-ed71-473b-9f1a-51b0982e140a/proxy-httpd/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.133920 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-vg2pg_8e847b86-2a1f-426f-bf4f-c7739ce8b65b/ceph-client-openstack-openstack-cell1/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.179912 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5057c712-9d00-4dcf-80ba-f67a171b0828/cinder-api-log/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.287583 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5057c712-9d00-4dcf-80ba-f67a171b0828/cinder-api/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.530798 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_89875ded-a9cb-4747-805a-15d7720291f6/probe/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.542525 4766 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_cinder-backup-0_89875ded-a9cb-4747-805a-15d7720291f6/cinder-backup/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.703474 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d2b2de8b-3edd-4831-8c78-df96f83cceef/cinder-scheduler/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.797832 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d2b2de8b-3edd-4831-8c78-df96f83cceef/probe/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.868001 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_e5cb8425-9990-4aa8-8d37-c6276ef64bb4/cinder-volume/0.log" Dec 09 06:02:24 crc kubenswrapper[4766]: I1209 06:02:24.978189 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_e5cb8425-9990-4aa8-8d37-c6276ef64bb4/probe/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.070229 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-f2kbb_ca0814c4-dca6-4dbd-841b-79d94d5dabd5/configure-network-openstack-openstack-cell1/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.222684 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-gvx4s_ea6ef63b-4223-4237-9ba9-d389f0b11cc5/configure-os-openstack-openstack-cell1/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.266133 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cdbcddd87-rz29w_973b0c8d-4740-4361-99a9-aeb033dd98f5/init/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.508512 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cdbcddd87-rz29w_973b0c8d-4740-4361-99a9-aeb033dd98f5/init/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 
06:02:25.545993 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cdbcddd87-rz29w_973b0c8d-4740-4361-99a9-aeb033dd98f5/dnsmasq-dns/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.619985 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-nmbq8_03e6ab22-c008-415b-8d9d-08ca8d4b3379/download-cache-openstack-openstack-cell1/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.806154 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5b4085a1-eb3c-453f-a519-9901c33de716/glance-log/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.820904 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5b4085a1-eb3c-453f-a519-9901c33de716/glance-httpd/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.912747 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9d2658e-5ee6-4962-81c3-5e7406d32b6f/glance-log/0.log" Dec 09 06:02:25 crc kubenswrapper[4766]: I1209 06:02:25.948295 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9d2658e-5ee6-4962-81c3-5e7406d32b6f/glance-httpd/0.log" Dec 09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.155505 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-75dd6ffcd6-x4r8x_b17e303d-fda5-45e4-a409-56c58f010d9e/heat-api/0.log" Dec 09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.350875 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-674c694cd8-x85ld_4ac6d93b-846d-4e73-a338-172101de6d3c/heat-engine/0.log" Dec 09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.352382 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-9dcdf99c4-hcdzh_93f7bf6c-55b6-4a13-ab8e-7b784ee2cdac/heat-cfnapi/0.log" Dec 
09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.501413 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b6cc9f797-qt9w7_d6f4aeda-1525-486a-83d0-9cb677713681/horizon/0.log" Dec 09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.577388 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-ntld8_85b25963-4e2f-4fa9-828a-209e27d8c39b/install-certs-openstack-openstack-cell1/0.log" Dec 09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.593151 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b6cc9f797-qt9w7_d6f4aeda-1525-486a-83d0-9cb677713681/horizon-log/0.log" Dec 09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.706842 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-zr46j_65aff17e-b3a1-487f-98fc-c6a81aaf1029/install-os-openstack-openstack-cell1/0.log" Dec 09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.940922 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8b8987d78-zlv6w_c4c26389-5cc4-421f-b5f4-6135179a1ef7/keystone-api/0.log" Dec 09 06:02:26 crc kubenswrapper[4766]: I1209 06:02:26.983392 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29420941-cct7p_612474e2-a120-42c9-b73d-b3bef7c80036/keystone-cron/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.079017 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29421001-5q7md_f6e655b6-f2b9-4866-9ff9-98f2d500b061/keystone-cron/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.240865 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_0bbf5934-e11a-4118-afde-57ecc9d95cd0/kube-state-metrics/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.352937 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-r2jnr_0ef78455-720c-4e11-b479-4f665656e20b/libvirt-openstack-openstack-cell1/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.438051 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8b487089-eff1-45b3-b892-26574d04acfe/manila-api-log/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.541078 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_8b487089-eff1-45b3-b892-26574d04acfe/manila-api/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.614141 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8a74bac3-3464-4c09-bec4-bd0d2e0e88f9/manila-scheduler/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.668132 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_8a74bac3-3464-4c09-bec4-bd0d2e0e88f9/probe/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.793059 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_6458ca19-4210-498d-8022-305da98f2544/manila-share/0.log" Dec 09 06:02:27 crc kubenswrapper[4766]: I1209 06:02:27.854275 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_6458ca19-4210-498d-8022-305da98f2544/probe/0.log" Dec 09 06:02:28 crc kubenswrapper[4766]: I1209 06:02:28.136018 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b4db9d857-5hs6p_d1456b18-0735-4283-b268-f40d1fd42634/neutron-httpd/0.log" Dec 09 06:02:28 crc kubenswrapper[4766]: I1209 06:02:28.188813 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b4db9d857-5hs6p_d1456b18-0735-4283-b268-f40d1fd42634/neutron-api/0.log" Dec 09 06:02:28 crc kubenswrapper[4766]: I1209 06:02:28.322283 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-cpxph_c04eb3ac-c0d3-470a-a741-3e8369270d60/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 09 06:02:28 crc kubenswrapper[4766]: I1209 06:02:28.534805 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-gkrmn_229ed8f5-5e3b-4f6b-8dbb-860af1016c20/neutron-metadata-openstack-openstack-cell1/0.log" Dec 09 06:02:28 crc kubenswrapper[4766]: I1209 06:02:28.744035 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-k6sgx_1992f13d-f848-4cef-a27c-f464d65b48f2/neutron-sriov-openstack-openstack-cell1/0.log" Dec 09 06:02:28 crc kubenswrapper[4766]: I1209 06:02:28.840566 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_23d93ca5-3499-4896-9d83-4641813d0844/nova-api-api/0.log" Dec 09 06:02:28 crc kubenswrapper[4766]: I1209 06:02:28.929469 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_23d93ca5-3499-4896-9d83-4641813d0844/nova-api-log/0.log" Dec 09 06:02:29 crc kubenswrapper[4766]: I1209 06:02:29.052921 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a70e4b72-a141-4697-83de-8c3fbf6f9c58/nova-cell0-conductor-conductor/0.log" Dec 09 06:02:29 crc kubenswrapper[4766]: I1209 06:02:29.529361 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_714ab1b8-4018-410b-893f-a4acee38b1ca/nova-cell1-conductor-conductor/0.log" Dec 09 06:02:29 crc kubenswrapper[4766]: I1209 06:02:29.595342 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_57a97579-3281-4578-9f14-3429d802e69c/nova-cell1-novncproxy-novncproxy/0.log" Dec 09 06:02:29 crc kubenswrapper[4766]: I1209 06:02:29.737201 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellg4v7v_8683912f-3cfa-4505-abfc-49943b3965c7/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 09 06:02:29 crc kubenswrapper[4766]: I1209 06:02:29.931778 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-hjl6d_5ff84ae1-d39e-43fb-8ceb-2bd935d486b9/nova-cell1-openstack-openstack-cell1/0.log" Dec 09 06:02:30 crc kubenswrapper[4766]: I1209 06:02:30.046963 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_37589400-9a3a-4acc-baae-2ff3d1f09e30/nova-metadata-log/0.log" Dec 09 06:02:30 crc kubenswrapper[4766]: I1209 06:02:30.112040 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_37589400-9a3a-4acc-baae-2ff3d1f09e30/nova-metadata-metadata/0.log" Dec 09 06:02:30 crc kubenswrapper[4766]: I1209 06:02:30.311027 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e73e246e-ca55-4721-8836-da091b7cb32d/nova-scheduler-scheduler/0.log" Dec 09 06:02:30 crc kubenswrapper[4766]: I1209 06:02:30.354424 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-544d8574f4-hrpph_e8178831-8e45-4026-9a23-d402c776fccd/init/0.log" Dec 09 06:02:30 crc kubenswrapper[4766]: I1209 06:02:30.611457 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-544d8574f4-hrpph_e8178831-8e45-4026-9a23-d402c776fccd/init/0.log" Dec 09 06:02:30 crc kubenswrapper[4766]: I1209 06:02:30.619688 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-544d8574f4-hrpph_e8178831-8e45-4026-9a23-d402c776fccd/octavia-api-provider-agent/0.log" Dec 09 06:02:30 crc kubenswrapper[4766]: I1209 06:02:30.858285 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-w75w9_42c16a26-9345-40e9-b970-01eadcb1e1cf/init/0.log" 
Dec 09 06:02:30 crc kubenswrapper[4766]: I1209 06:02:30.916820 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-544d8574f4-hrpph_e8178831-8e45-4026-9a23-d402c776fccd/octavia-api/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.057442 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-w75w9_42c16a26-9345-40e9-b970-01eadcb1e1cf/init/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.158129 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-w75w9_42c16a26-9345-40e9-b970-01eadcb1e1cf/octavia-healthmanager/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.164921 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-b69rx_707b62f5-7629-49cd-9cc3-a2a73d21ea70/init/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.393064 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-b69rx_707b62f5-7629-49cd-9cc3-a2a73d21ea70/init/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.436274 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-b69rx_707b62f5-7629-49cd-9cc3-a2a73d21ea70/octavia-housekeeping/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.487642 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-46tzk_26760c25-6203-40b2-8302-23310f358a65/init/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.661381 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-46tzk_26760c25-6203-40b2-8302-23310f358a65/init/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.686248 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-46tzk_26760c25-6203-40b2-8302-23310f358a65/octavia-amphora-httpd/0.log" Dec 09 06:02:31 crc kubenswrapper[4766]: I1209 06:02:31.793981 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wh6j7_d29a41da-acab-46d3-bc4c-df381d85613c/init/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.028406 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wh6j7_d29a41da-acab-46d3-bc4c-df381d85613c/init/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.074369 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wh6j7_d29a41da-acab-46d3-bc4c-df381d85613c/octavia-rsyslog/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.147908 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-78vr2_ae9d7a5e-caed-457f-9a5c-56faf7db6547/init/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.444452 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-78vr2_ae9d7a5e-caed-457f-9a5c-56faf7db6547/init/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.466453 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ad2a4151-7484-4aad-8fc9-e7c7bbbb753c/mysql-bootstrap/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.582127 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-78vr2_ae9d7a5e-caed-457f-9a5c-56faf7db6547/octavia-worker/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.695015 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ad2a4151-7484-4aad-8fc9-e7c7bbbb753c/mysql-bootstrap/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.701419 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_ad2a4151-7484-4aad-8fc9-e7c7bbbb753c/galera/0.log" Dec 09 06:02:32 crc kubenswrapper[4766]: I1209 06:02:32.837845 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_970b977f-fe90-4458-9acd-bef89b225b9d/mysql-bootstrap/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.073913 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_970b977f-fe90-4458-9acd-bef89b225b9d/mysql-bootstrap/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.114852 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8d7295db-9761-4829-93af-705e9af64b1c/openstackclient/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.139183 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_970b977f-fe90-4458-9acd-bef89b225b9d/galera/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.349475 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-94xfz_7415e725-513c-4557-8d04-62a9c452ec6c/ovn-controller/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.521962 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4kszh_41fc5d08-4777-4c4a-b7f4-3efb34963689/openstack-network-exporter/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.619821 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zx2fg_0572d99b-c46f-4734-bf6f-72c01b8a8e84/ovsdb-server-init/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.853821 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zx2fg_0572d99b-c46f-4734-bf6f-72c01b8a8e84/ovsdb-server-init/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.867775 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-zx2fg_0572d99b-c46f-4734-bf6f-72c01b8a8e84/ovs-vswitchd/0.log" Dec 09 06:02:33 crc kubenswrapper[4766]: I1209 06:02:33.883310 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zx2fg_0572d99b-c46f-4734-bf6f-72c01b8a8e84/ovsdb-server/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.118316 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_71b809cd-dee1-42e4-a46a-8c7f9c067678/openstack-network-exporter/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.127893 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_71b809cd-dee1-42e4-a46a-8c7f9c067678/ovn-northd/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.225277 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-vk8d7_1dc3ce4a-ffe5-4d29-a523-9ff6591c0b5f/ovn-openstack-openstack-cell1/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.416470 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_23400620-2f42-40c5-8292-17c64ab567ea/ovsdbserver-nb/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.435725 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_23400620-2f42-40c5-8292-17c64ab567ea/openstack-network-exporter/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.628481 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_be8365e2-d3eb-4562-a24a-6dfb4342cfd5/openstack-network-exporter/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.702232 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_be8365e2-d3eb-4562-a24a-6dfb4342cfd5/ovsdbserver-nb/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.735758 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-2_85d69681-4b38-42ca-9e60-93fb02d90e75/openstack-network-exporter/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.844250 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_85d69681-4b38-42ca-9e60-93fb02d90e75/ovsdbserver-nb/0.log" Dec 09 06:02:34 crc kubenswrapper[4766]: I1209 06:02:34.931258 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_259ef0b7-117e-4842-8af1-2038ecd40d47/openstack-network-exporter/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.034953 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_259ef0b7-117e-4842-8af1-2038ecd40d47/ovsdbserver-sb/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.154957 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6/openstack-network-exporter/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.245688 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_3d9fdceb-c59e-4d47-8dc1-0d6bdcb748d6/ovsdbserver-sb/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.379168 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_d16ac2e2-75aa-4533-8fc6-4e0e8e006b14/openstack-network-exporter/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.444528 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_d16ac2e2-75aa-4533-8fc6-4e0e8e006b14/ovsdbserver-sb/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.657860 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bf6488d8b-5q42s_246c2111-735d-43b7-bb05-a0566897889e/placement-api/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.785360 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cs886b_90e77954-e1ed-4e5d-97c1-bdd8e94841b7/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.791940 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bf6488d8b-5q42s_246c2111-735d-43b7-bb05-a0566897889e/placement-log/0.log" Dec 09 06:02:35 crc kubenswrapper[4766]: I1209 06:02:35.976695 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_375a1ede-7b02-4a66-aed0-02666883a8f2/init-config-reloader/0.log" Dec 09 06:02:36 crc kubenswrapper[4766]: I1209 06:02:36.226960 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_375a1ede-7b02-4a66-aed0-02666883a8f2/init-config-reloader/0.log" Dec 09 06:02:36 crc kubenswrapper[4766]: I1209 06:02:36.242007 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_375a1ede-7b02-4a66-aed0-02666883a8f2/config-reloader/0.log" Dec 09 06:02:36 crc kubenswrapper[4766]: I1209 06:02:36.282055 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_375a1ede-7b02-4a66-aed0-02666883a8f2/thanos-sidecar/0.log" Dec 09 06:02:36 crc kubenswrapper[4766]: I1209 06:02:36.282910 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_375a1ede-7b02-4a66-aed0-02666883a8f2/prometheus/0.log" Dec 09 06:02:36 crc kubenswrapper[4766]: I1209 06:02:36.479124 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bdcb35ea-4aea-4b76-a13e-6fd4b9398991/setup-container/0.log" Dec 09 06:02:36 crc kubenswrapper[4766]: I1209 06:02:36.648662 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bdcb35ea-4aea-4b76-a13e-6fd4b9398991/setup-container/0.log" Dec 09 
06:02:36 crc kubenswrapper[4766]: I1209 06:02:36.731066 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bdcb35ea-4aea-4b76-a13e-6fd4b9398991/rabbitmq/0.log" Dec 09 06:02:36 crc kubenswrapper[4766]: I1209 06:02:36.792991 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_74a6e9a7-3db2-49eb-bbaf-40536dcc8d88/setup-container/0.log" Dec 09 06:02:37 crc kubenswrapper[4766]: I1209 06:02:37.038265 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_74a6e9a7-3db2-49eb-bbaf-40536dcc8d88/setup-container/0.log" Dec 09 06:02:37 crc kubenswrapper[4766]: I1209 06:02:37.127633 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-fzkl6_ce33c619-39af-44a4-9643-a60c928971b5/reboot-os-openstack-openstack-cell1/0.log" Dec 09 06:02:37 crc kubenswrapper[4766]: I1209 06:02:37.368961 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_74a6e9a7-3db2-49eb-bbaf-40536dcc8d88/rabbitmq/0.log" Dec 09 06:02:37 crc kubenswrapper[4766]: I1209 06:02:37.389260 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-sksdt_fa7129ae-c7da-4b11-ade0-9d16ae5373d5/run-os-openstack-openstack-cell1/0.log" Dec 09 06:02:37 crc kubenswrapper[4766]: I1209 06:02:37.581365 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-vpkgx_322ba64a-2659-4d62-b831-219df10e7c72/ssh-known-hosts-openstack/0.log" Dec 09 06:02:37 crc kubenswrapper[4766]: I1209 06:02:37.668857 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-nskww_2cc13e31-7024-4112-bddd-c79864a1df19/telemetry-openstack-openstack-cell1/0.log" Dec 09 06:02:37 crc kubenswrapper[4766]: I1209 06:02:37.854982 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-x6w9q_b536acce-84ce-48e5-b47e-9c68a26c82a7/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 09 06:02:38 crc kubenswrapper[4766]: I1209 06:02:38.411993 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-gmrnw_80ed47d3-1b67-46bf-a503-75452343db85/validate-network-openstack-openstack-cell1/0.log" Dec 09 06:02:38 crc kubenswrapper[4766]: I1209 06:02:38.806530 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4e13dec0-38aa-4f9f-944a-f96185e640eb/memcached/0.log" Dec 09 06:03:01 crc kubenswrapper[4766]: I1209 06:03:01.287579 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-gc98f_ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0/kube-rbac-proxy/0.log" Dec 09 06:03:01 crc kubenswrapper[4766]: I1209 06:03:01.395425 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-gc98f_ea730dd4-4dcc-4bea-a174-1f4bb43a7ee0/manager/0.log" Dec 09 06:03:01 crc kubenswrapper[4766]: I1209 06:03:01.561675 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-vvdmw_ac56e057-6439-40ab-bb04-ce228a828444/kube-rbac-proxy/0.log" Dec 09 06:03:01 crc kubenswrapper[4766]: I1209 06:03:01.590834 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-vvdmw_ac56e057-6439-40ab-bb04-ce228a828444/manager/0.log" Dec 09 06:03:01 crc kubenswrapper[4766]: I1209 06:03:01.724805 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-p2sks_bb6ed96a-0045-42ea-a13e-dc4a82714b9d/kube-rbac-proxy/0.log" Dec 09 06:03:01 crc kubenswrapper[4766]: I1209 06:03:01.828638 
4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-p2sks_bb6ed96a-0045-42ea-a13e-dc4a82714b9d/manager/0.log" Dec 09 06:03:01 crc kubenswrapper[4766]: I1209 06:03:01.869508 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9_dac3aa15-44b2-4d5f-bd77-94928a5e3a62/util/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.022557 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9_dac3aa15-44b2-4d5f-bd77-94928a5e3a62/util/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.032084 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9_dac3aa15-44b2-4d5f-bd77-94928a5e3a62/pull/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.057547 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9_dac3aa15-44b2-4d5f-bd77-94928a5e3a62/pull/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.215061 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9_dac3aa15-44b2-4d5f-bd77-94928a5e3a62/util/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.240282 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9_dac3aa15-44b2-4d5f-bd77-94928a5e3a62/pull/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.261064 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_e9deb3aa6be3dfcc9cdd111a6d9b6924493bcb3a1757bbc6dae501f87bhmbj9_dac3aa15-44b2-4d5f-bd77-94928a5e3a62/extract/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.424630 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-89h67_95b141d2-5fb4-46f6-b5af-92720c9be11c/kube-rbac-proxy/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.546099 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5z662_146239d5-9360-4fa2-8a76-01743455b5f1/kube-rbac-proxy/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.556348 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-89h67_95b141d2-5fb4-46f6-b5af-92720c9be11c/manager/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.684305 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5z662_146239d5-9360-4fa2-8a76-01743455b5f1/manager/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.788939 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6rt97_735b0ee4-6bb2-41f4-b6e9-494e5d73b584/kube-rbac-proxy/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.797253 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6rt97_735b0ee4-6bb2-41f4-b6e9-494e5d73b584/manager/0.log" Dec 09 06:03:02 crc kubenswrapper[4766]: I1209 06:03:02.974307 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-sn9z2_ef10710a-81c7-46f4-8c5c-3aabdc02a833/kube-rbac-proxy/0.log" Dec 09 06:03:03 crc kubenswrapper[4766]: I1209 
06:03:03.208533 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-ng459_e8d80517-4f1e-4123-b002-e7990cc9c945/kube-rbac-proxy/0.log" Dec 09 06:03:03 crc kubenswrapper[4766]: I1209 06:03:03.227554 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-ng459_e8d80517-4f1e-4123-b002-e7990cc9c945/manager/0.log" Dec 09 06:03:03 crc kubenswrapper[4766]: I1209 06:03:03.462114 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-7cdxh_50ba0df7-74df-4798-a0c0-39dda9c4e3ef/kube-rbac-proxy/0.log" Dec 09 06:03:03 crc kubenswrapper[4766]: I1209 06:03:03.524480 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-sn9z2_ef10710a-81c7-46f4-8c5c-3aabdc02a833/manager/0.log" Dec 09 06:03:03 crc kubenswrapper[4766]: I1209 06:03:03.596931 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-7cdxh_50ba0df7-74df-4798-a0c0-39dda9c4e3ef/manager/0.log" Dec 09 06:03:03 crc kubenswrapper[4766]: I1209 06:03:03.699342 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-shhtg_b6f43661-b796-4f31-aaa9-482f85952578/kube-rbac-proxy/0.log" Dec 09 06:03:03 crc kubenswrapper[4766]: I1209 06:03:03.816314 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-shhtg_b6f43661-b796-4f31-aaa9-482f85952578/manager/0.log" Dec 09 06:03:03 crc kubenswrapper[4766]: I1209 06:03:03.817911 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-mwbr2_3467a8bf-3f6b-49ce-9e3a-5d834456bbaf/kube-rbac-proxy/0.log" Dec 
09 06:03:03 crc kubenswrapper[4766]: I1209 06:03:03.983057 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-mwbr2_3467a8bf-3f6b-49ce-9e3a-5d834456bbaf/manager/0.log" Dec 09 06:03:04 crc kubenswrapper[4766]: I1209 06:03:04.022735 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dgds5_14e6b34f-8412-44ea-8342-0350ccf7f7c9/kube-rbac-proxy/0.log" Dec 09 06:03:04 crc kubenswrapper[4766]: I1209 06:03:04.150564 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-dgds5_14e6b34f-8412-44ea-8342-0350ccf7f7c9/manager/0.log" Dec 09 06:03:04 crc kubenswrapper[4766]: I1209 06:03:04.226525 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-qfdx4_3b4cd02e-a82d-48d9-8078-9cdf3a65767c/kube-rbac-proxy/0.log" Dec 09 06:03:04 crc kubenswrapper[4766]: I1209 06:03:04.421809 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-qfdx4_3b4cd02e-a82d-48d9-8078-9cdf3a65767c/manager/0.log" Dec 09 06:03:04 crc kubenswrapper[4766]: I1209 06:03:04.474712 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-rb9kw_d1aa5579-7f42-4ee2-af62-3b2a7d391101/kube-rbac-proxy/0.log" Dec 09 06:03:04 crc kubenswrapper[4766]: I1209 06:03:04.553657 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-rb9kw_d1aa5579-7f42-4ee2-af62-3b2a7d391101/manager/0.log" Dec 09 06:03:04 crc kubenswrapper[4766]: I1209 06:03:04.640474 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fbrcdj_b2580526-5f4a-4340-9856-bc78a320e610/kube-rbac-proxy/0.log" Dec 09 06:03:04 crc kubenswrapper[4766]: I1209 06:03:04.690137 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fbrcdj_b2580526-5f4a-4340-9856-bc78a320e610/manager/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 06:03:05.110955 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-h2hcg_fd134238-f047-4601-a2e0-58781bdb521b/registry-server/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 06:03:05.201703 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6799c88b79-8lfhp_aa308d1e-c51d-4942-b7a0-a5b29e0649f0/operator/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 06:03:05.316907 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-ln78g_0d1a15a1-afd0-41a2-bf4d-21d98d4730b4/kube-rbac-proxy/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 06:03:05.456562 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rllsr_636d7080-c310-4a2a-a07c-ef0aab6412ba/kube-rbac-proxy/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 06:03:05.527434 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-ln78g_0d1a15a1-afd0-41a2-bf4d-21d98d4730b4/manager/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 06:03:05.662347 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rllsr_636d7080-c310-4a2a-a07c-ef0aab6412ba/manager/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 
06:03:05.834924 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dxctt_581540bb-4d83-41b5-9384-e91238383025/operator/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 06:03:05.906739 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-whnqg_3aea0f89-2c0b-4705-92cf-17a37169675e/manager/0.log" Dec 09 06:03:05 crc kubenswrapper[4766]: I1209 06:03:05.914452 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-whnqg_3aea0f89-2c0b-4705-92cf-17a37169675e/kube-rbac-proxy/0.log" Dec 09 06:03:06 crc kubenswrapper[4766]: I1209 06:03:06.019087 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-fk6b4_f911184d-8572-4bc3-abbd-b76245ce463c/kube-rbac-proxy/0.log" Dec 09 06:03:06 crc kubenswrapper[4766]: I1209 06:03:06.271459 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-62wlz_4fa83023-eb1d-4ab9-b80a-b9bd04d342a3/manager/0.log" Dec 09 06:03:06 crc kubenswrapper[4766]: I1209 06:03:06.307476 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-fk6b4_f911184d-8572-4bc3-abbd-b76245ce463c/manager/0.log" Dec 09 06:03:06 crc kubenswrapper[4766]: I1209 06:03:06.331065 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-62wlz_4fa83023-eb1d-4ab9-b80a-b9bd04d342a3/kube-rbac-proxy/0.log" Dec 09 06:03:06 crc kubenswrapper[4766]: I1209 06:03:06.462252 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-74tq4_c3606954-b281-4fa3-b96a-a62ab4092a78/kube-rbac-proxy/0.log" Dec 09 
06:03:06 crc kubenswrapper[4766]: I1209 06:03:06.543035 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-74tq4_c3606954-b281-4fa3-b96a-a62ab4092a78/manager/0.log" Dec 09 06:03:07 crc kubenswrapper[4766]: I1209 06:03:07.188041 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-646dd6f965-mkjpj_a315c06a-893b-4f1b-9b0e-0120afcfeb00/manager/0.log" Dec 09 06:03:26 crc kubenswrapper[4766]: I1209 06:03:26.798364 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vt8xz_d0581951-bbb8-4550-8ced-7c1431f6deb8/control-plane-machine-set-operator/0.log" Dec 09 06:03:27 crc kubenswrapper[4766]: I1209 06:03:27.034702 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fbnmb_56991f65-9178-42a0-ba48-3a53256cd715/kube-rbac-proxy/0.log" Dec 09 06:03:27 crc kubenswrapper[4766]: I1209 06:03:27.044591 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fbnmb_56991f65-9178-42a0-ba48-3a53256cd715/machine-api-operator/0.log" Dec 09 06:03:42 crc kubenswrapper[4766]: I1209 06:03:42.105742 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-zb5bt_ff28ddcc-8cd5-4f2e-bb76-0d23db11d40b/cert-manager-controller/0.log" Dec 09 06:03:42 crc kubenswrapper[4766]: I1209 06:03:42.307098 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-ljgrb_0d683a6a-fcb6-4476-9a2d-e1905c04d0cb/cert-manager-cainjector/0.log" Dec 09 06:03:42 crc kubenswrapper[4766]: I1209 06:03:42.405805 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-g9vlz_80e97801-4f98-4f32-b540-c9edb2ad39a9/cert-manager-webhook/0.log" Dec 09 06:03:56 crc kubenswrapper[4766]: I1209 06:03:56.246709 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-sfpqb_e456122d-6335-472c-a630-18cbf88d4352/nmstate-console-plugin/0.log" Dec 09 06:03:56 crc kubenswrapper[4766]: I1209 06:03:56.364677 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-j2cd4_9cba229b-a520-4e88-bf57-d2418eedc7bd/nmstate-handler/0.log" Dec 09 06:03:56 crc kubenswrapper[4766]: I1209 06:03:56.430435 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-m9cmd_f1348766-b8a2-4754-88b9-078bf2081465/kube-rbac-proxy/0.log" Dec 09 06:03:56 crc kubenswrapper[4766]: I1209 06:03:56.473545 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-m9cmd_f1348766-b8a2-4754-88b9-078bf2081465/nmstate-metrics/0.log" Dec 09 06:03:56 crc kubenswrapper[4766]: I1209 06:03:56.652749 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-wt8wd_c191369f-2896-4aa5-9a85-19fe4ba4ba6d/nmstate-webhook/0.log" Dec 09 06:03:56 crc kubenswrapper[4766]: I1209 06:03:56.676053 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-dcnxv_2c9cee8d-ebaf-416e-8dde-e9bfddf9775a/nmstate-operator/0.log" Dec 09 06:04:07 crc kubenswrapper[4766]: I1209 06:04:07.317096 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 06:04:07 crc kubenswrapper[4766]: I1209 06:04:07.317663 4766 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 06:04:14 crc kubenswrapper[4766]: I1209 06:04:14.198698 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-n58vw_8b9c8d01-0ad9-4730-b328-9d67fb4322ba/kube-rbac-proxy/0.log" Dec 09 06:04:14 crc kubenswrapper[4766]: I1209 06:04:14.449383 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-frr-files/0.log" Dec 09 06:04:14 crc kubenswrapper[4766]: I1209 06:04:14.679318 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-reloader/0.log" Dec 09 06:04:14 crc kubenswrapper[4766]: I1209 06:04:14.688225 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-n58vw_8b9c8d01-0ad9-4730-b328-9d67fb4322ba/controller/0.log" Dec 09 06:04:14 crc kubenswrapper[4766]: I1209 06:04:14.702704 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-metrics/0.log" Dec 09 06:04:14 crc kubenswrapper[4766]: I1209 06:04:14.746127 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-frr-files/0.log" Dec 09 06:04:14 crc kubenswrapper[4766]: I1209 06:04:14.885677 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-reloader/0.log" Dec 09 06:04:15 crc kubenswrapper[4766]: I1209 06:04:15.081649 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-metrics/0.log" Dec 09 06:04:15 crc kubenswrapper[4766]: I1209 06:04:15.113812 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-metrics/0.log" Dec 09 06:04:15 crc kubenswrapper[4766]: I1209 06:04:15.113941 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-reloader/0.log" Dec 09 06:04:15 crc kubenswrapper[4766]: I1209 06:04:15.144744 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-frr-files/0.log" Dec 09 06:04:15 crc kubenswrapper[4766]: I1209 06:04:15.866680 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-metrics/0.log" Dec 09 06:04:15 crc kubenswrapper[4766]: I1209 06:04:15.872781 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-reloader/0.log" Dec 09 06:04:15 crc kubenswrapper[4766]: I1209 06:04:15.875250 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/cp-frr-files/0.log" Dec 09 06:04:15 crc kubenswrapper[4766]: I1209 06:04:15.914083 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/controller/0.log" Dec 09 06:04:16 crc kubenswrapper[4766]: I1209 06:04:16.081455 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/frr-metrics/0.log" Dec 09 06:04:16 crc kubenswrapper[4766]: I1209 06:04:16.123526 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/kube-rbac-proxy/0.log" Dec 09 06:04:16 crc kubenswrapper[4766]: I1209 06:04:16.138768 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/kube-rbac-proxy-frr/0.log" Dec 09 06:04:16 crc kubenswrapper[4766]: I1209 06:04:16.338279 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-r699m_0ee70002-33af-46fd-9fab-162fb47d4b22/frr-k8s-webhook-server/0.log" Dec 09 06:04:16 crc kubenswrapper[4766]: I1209 06:04:16.364317 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/reloader/0.log" Dec 09 06:04:16 crc kubenswrapper[4766]: I1209 06:04:16.613402 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79d558dc88-8rvbz_c75c2686-ab27-4d76-8164-72f00b640297/manager/0.log" Dec 09 06:04:16 crc kubenswrapper[4766]: I1209 06:04:16.857403 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66568d58f-2f5kz_8a754727-b501-415f-9466-8dd5e4600864/webhook-server/0.log" Dec 09 06:04:16 crc kubenswrapper[4766]: I1209 06:04:16.937729 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ftjpq_cdf78232-a230-4c4e-a41e-fd13446f16c1/kube-rbac-proxy/0.log" Dec 09 06:04:18 crc kubenswrapper[4766]: I1209 06:04:18.027958 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ftjpq_cdf78232-a230-4c4e-a41e-fd13446f16c1/speaker/0.log" Dec 09 06:04:19 crc kubenswrapper[4766]: I1209 06:04:19.505492 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-t7t9l_e61cbd09-5f69-4f62-b405-913a0fa68111/frr/0.log" Dec 09 06:04:31 crc kubenswrapper[4766]: I1209 06:04:31.504622 4766 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2_94aaaa56-1eca-4031-a1a9-54f8c7ce4049/util/0.log" Dec 09 06:04:31 crc kubenswrapper[4766]: I1209 06:04:31.682547 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2_94aaaa56-1eca-4031-a1a9-54f8c7ce4049/util/0.log" Dec 09 06:04:31 crc kubenswrapper[4766]: I1209 06:04:31.708481 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2_94aaaa56-1eca-4031-a1a9-54f8c7ce4049/pull/0.log" Dec 09 06:04:31 crc kubenswrapper[4766]: I1209 06:04:31.741830 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2_94aaaa56-1eca-4031-a1a9-54f8c7ce4049/pull/0.log" Dec 09 06:04:31 crc kubenswrapper[4766]: I1209 06:04:31.947606 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2_94aaaa56-1eca-4031-a1a9-54f8c7ce4049/util/0.log" Dec 09 06:04:31 crc kubenswrapper[4766]: I1209 06:04:31.956805 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2_94aaaa56-1eca-4031-a1a9-54f8c7ce4049/pull/0.log" Dec 09 06:04:31 crc kubenswrapper[4766]: I1209 06:04:31.963837 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931afd5r2_94aaaa56-1eca-4031-a1a9-54f8c7ce4049/extract/0.log" Dec 09 06:04:32 crc kubenswrapper[4766]: I1209 06:04:32.111573 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9_63e40c3d-4950-4959-8841-11e962ffc2af/util/0.log" Dec 09 06:04:32 crc kubenswrapper[4766]: I1209 06:04:32.335490 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9_63e40c3d-4950-4959-8841-11e962ffc2af/util/0.log" Dec 09 06:04:32 crc kubenswrapper[4766]: I1209 06:04:32.353662 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9_63e40c3d-4950-4959-8841-11e962ffc2af/pull/0.log" Dec 09 06:04:32 crc kubenswrapper[4766]: I1209 06:04:32.399204 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9_63e40c3d-4950-4959-8841-11e962ffc2af/pull/0.log" Dec 09 06:04:32 crc kubenswrapper[4766]: I1209 06:04:32.529741 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9_63e40c3d-4950-4959-8841-11e962ffc2af/pull/0.log" Dec 09 06:04:32 crc kubenswrapper[4766]: I1209 06:04:32.548970 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9_63e40c3d-4950-4959-8841-11e962ffc2af/util/0.log" Dec 09 06:04:32 crc kubenswrapper[4766]: I1209 06:04:32.580486 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fptvr9_63e40c3d-4950-4959-8841-11e962ffc2af/extract/0.log" Dec 09 06:04:32 crc kubenswrapper[4766]: I1209 06:04:32.960358 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg_ce45094b-177d-4065-a7c0-46a656359ed4/util/0.log" Dec 09 
06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.165274 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg_ce45094b-177d-4065-a7c0-46a656359ed4/util/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.188913 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg_ce45094b-177d-4065-a7c0-46a656359ed4/pull/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.215316 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg_ce45094b-177d-4065-a7c0-46a656359ed4/pull/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.370054 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg_ce45094b-177d-4065-a7c0-46a656359ed4/pull/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.433722 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg_ce45094b-177d-4065-a7c0-46a656359ed4/extract/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.461449 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210sflpg_ce45094b-177d-4065-a7c0-46a656359ed4/util/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.576037 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc_57c50439-8b6a-49d7-8c83-04cf28b100f1/util/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.791960 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc_57c50439-8b6a-49d7-8c83-04cf28b100f1/pull/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.793704 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc_57c50439-8b6a-49d7-8c83-04cf28b100f1/pull/0.log" Dec 09 06:04:33 crc kubenswrapper[4766]: I1209 06:04:33.836175 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc_57c50439-8b6a-49d7-8c83-04cf28b100f1/util/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.020843 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc_57c50439-8b6a-49d7-8c83-04cf28b100f1/util/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.047687 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc_57c50439-8b6a-49d7-8c83-04cf28b100f1/pull/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.072463 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83x6pvc_57c50439-8b6a-49d7-8c83-04cf28b100f1/extract/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.202974 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8hff_327bdf4b-36b9-44f2-8e3e-38fa0fccebf7/extract-utilities/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.421878 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8hff_327bdf4b-36b9-44f2-8e3e-38fa0fccebf7/extract-utilities/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 
06:04:34.421993 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8hff_327bdf4b-36b9-44f2-8e3e-38fa0fccebf7/extract-content/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.426968 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8hff_327bdf4b-36b9-44f2-8e3e-38fa0fccebf7/extract-content/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.633733 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8hff_327bdf4b-36b9-44f2-8e3e-38fa0fccebf7/extract-utilities/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.687756 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8hff_327bdf4b-36b9-44f2-8e3e-38fa0fccebf7/extract-content/0.log" Dec 09 06:04:34 crc kubenswrapper[4766]: I1209 06:04:34.846067 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d5d2v_e0dd3ab4-6d2e-43ff-a318-112537d04862/extract-utilities/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.080636 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d5d2v_e0dd3ab4-6d2e-43ff-a318-112537d04862/extract-utilities/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.098762 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d5d2v_e0dd3ab4-6d2e-43ff-a318-112537d04862/extract-content/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.111834 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d5d2v_e0dd3ab4-6d2e-43ff-a318-112537d04862/extract-content/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.411943 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-d5d2v_e0dd3ab4-6d2e-43ff-a318-112537d04862/extract-content/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.438686 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d5d2v_e0dd3ab4-6d2e-43ff-a318-112537d04862/extract-utilities/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.591473 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g8hff_327bdf4b-36b9-44f2-8e3e-38fa0fccebf7/registry-server/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.692456 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d5d2v_e0dd3ab4-6d2e-43ff-a318-112537d04862/registry-server/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.720689 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wccgw_b4111b91-5da1-4878-bcd4-2f2b34174e52/marketplace-operator/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.814874 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s4fw9_81c2824b-d083-4153-ad95-e8629b600175/extract-utilities/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.954594 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s4fw9_81c2824b-d083-4153-ad95-e8629b600175/extract-content/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.955090 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s4fw9_81c2824b-d083-4153-ad95-e8629b600175/extract-utilities/0.log" Dec 09 06:04:35 crc kubenswrapper[4766]: I1209 06:04:35.999645 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-s4fw9_81c2824b-d083-4153-ad95-e8629b600175/extract-content/0.log" Dec 09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.175919 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s4fw9_81c2824b-d083-4153-ad95-e8629b600175/extract-utilities/0.log" Dec 09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.228586 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s4fw9_81c2824b-d083-4153-ad95-e8629b600175/extract-content/0.log" Dec 09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.339120 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nts28_1a543154-2164-4e7d-ac52-9ee7420ff8ba/extract-utilities/0.log" Dec 09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.513429 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nts28_1a543154-2164-4e7d-ac52-9ee7420ff8ba/extract-utilities/0.log" Dec 09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.541668 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nts28_1a543154-2164-4e7d-ac52-9ee7420ff8ba/extract-content/0.log" Dec 09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.571617 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s4fw9_81c2824b-d083-4153-ad95-e8629b600175/registry-server/0.log" Dec 09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.579794 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nts28_1a543154-2164-4e7d-ac52-9ee7420ff8ba/extract-content/0.log" Dec 09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.795354 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nts28_1a543154-2164-4e7d-ac52-9ee7420ff8ba/extract-content/0.log" Dec 
09 06:04:36 crc kubenswrapper[4766]: I1209 06:04:36.795436 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nts28_1a543154-2164-4e7d-ac52-9ee7420ff8ba/extract-utilities/0.log" Dec 09 06:04:37 crc kubenswrapper[4766]: I1209 06:04:37.183068 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nts28_1a543154-2164-4e7d-ac52-9ee7420ff8ba/registry-server/0.log" Dec 09 06:04:37 crc kubenswrapper[4766]: I1209 06:04:37.316497 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 06:04:37 crc kubenswrapper[4766]: I1209 06:04:37.316543 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 06:04:51 crc kubenswrapper[4766]: I1209 06:04:51.132334 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-h8f7r_351744ca-7f25-47b3-b00e-f7bdaf6f1693/prometheus-operator/0.log" Dec 09 06:04:51 crc kubenswrapper[4766]: I1209 06:04:51.286611 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-684d6797b6-cx2wg_16f6a81d-c16a-43d8-8bd0-d58756bb2605/prometheus-operator-admission-webhook/0.log" Dec 09 06:04:51 crc kubenswrapper[4766]: I1209 06:04:51.320862 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-684d6797b6-mjtpc_4c5e8ca0-1744-4ea2-8e81-dab7828b617a/prometheus-operator-admission-webhook/0.log" Dec 09 06:04:51 crc kubenswrapper[4766]: I1209 06:04:51.479948 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-zrnmc_ca072608-0822-46da-a58f-e6544233b091/operator/0.log" Dec 09 06:04:51 crc kubenswrapper[4766]: I1209 06:04:51.512834 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-9smnr_a8b37c0c-fe68-4f18-a5df-43d3221934a2/perses-operator/0.log" Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.316717 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.317320 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.317377 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.318310 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4797219e0810bddab585efd706eef9cd1de0b04bb6ec690af775b468156933a0"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.318367 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://4797219e0810bddab585efd706eef9cd1de0b04bb6ec690af775b468156933a0" gracePeriod=600 Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.709363 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="4797219e0810bddab585efd706eef9cd1de0b04bb6ec690af775b468156933a0" exitCode=0 Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.709427 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"4797219e0810bddab585efd706eef9cd1de0b04bb6ec690af775b468156933a0"} Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.710132 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerStarted","Data":"9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6"} Dec 09 06:05:07 crc kubenswrapper[4766]: I1209 06:05:07.710172 4766 scope.go:117] "RemoveContainer" containerID="60b1e1bed7fa38bbf8013e80b1528a7878f2250f5550ab2aaac9f0b6af1e1a86" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.335195 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vz64v"] Dec 09 06:06:13 crc kubenswrapper[4766]: E1209 06:06:13.337172 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e655b6-f2b9-4866-9ff9-98f2d500b061" containerName="keystone-cron" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 
06:06:13.337276 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e655b6-f2b9-4866-9ff9-98f2d500b061" containerName="keystone-cron" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.340735 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e655b6-f2b9-4866-9ff9-98f2d500b061" containerName="keystone-cron" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.342484 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.364696 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vz64v"] Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.481561 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbqps\" (UniqueName: \"kubernetes.io/projected/6bbcf3cf-c08f-472c-afba-cda33029203e-kube-api-access-rbqps\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.481772 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-utilities\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.481904 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-catalog-content\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: 
I1209 06:06:13.583130 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-catalog-content\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.583189 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbqps\" (UniqueName: \"kubernetes.io/projected/6bbcf3cf-c08f-472c-afba-cda33029203e-kube-api-access-rbqps\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.583299 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-utilities\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.583611 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-catalog-content\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.583743 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-utilities\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.609108 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbqps\" (UniqueName: \"kubernetes.io/projected/6bbcf3cf-c08f-472c-afba-cda33029203e-kube-api-access-rbqps\") pod \"community-operators-vz64v\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:13 crc kubenswrapper[4766]: I1209 06:06:13.662974 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:14 crc kubenswrapper[4766]: I1209 06:06:14.287342 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vz64v"] Dec 09 06:06:14 crc kubenswrapper[4766]: W1209 06:06:14.291269 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bbcf3cf_c08f_472c_afba_cda33029203e.slice/crio-64a2632fb667ae281f46ea4219c887486627fc80f34c6104526fa34d98e3943f WatchSource:0}: Error finding container 64a2632fb667ae281f46ea4219c887486627fc80f34c6104526fa34d98e3943f: Status 404 returned error can't find the container with id 64a2632fb667ae281f46ea4219c887486627fc80f34c6104526fa34d98e3943f Dec 09 06:06:14 crc kubenswrapper[4766]: I1209 06:06:14.455961 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz64v" event={"ID":"6bbcf3cf-c08f-472c-afba-cda33029203e","Type":"ContainerStarted","Data":"64a2632fb667ae281f46ea4219c887486627fc80f34c6104526fa34d98e3943f"} Dec 09 06:06:15 crc kubenswrapper[4766]: I1209 06:06:15.478063 4766 generic.go:334] "Generic (PLEG): container finished" podID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerID="42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5" exitCode=0 Dec 09 06:06:15 crc kubenswrapper[4766]: I1209 06:06:15.478224 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz64v" 
event={"ID":"6bbcf3cf-c08f-472c-afba-cda33029203e","Type":"ContainerDied","Data":"42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5"} Dec 09 06:06:15 crc kubenswrapper[4766]: I1209 06:06:15.481242 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 06:06:16 crc kubenswrapper[4766]: I1209 06:06:16.490765 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz64v" event={"ID":"6bbcf3cf-c08f-472c-afba-cda33029203e","Type":"ContainerStarted","Data":"cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b"} Dec 09 06:06:17 crc kubenswrapper[4766]: I1209 06:06:17.500946 4766 generic.go:334] "Generic (PLEG): container finished" podID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerID="cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b" exitCode=0 Dec 09 06:06:17 crc kubenswrapper[4766]: I1209 06:06:17.501294 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz64v" event={"ID":"6bbcf3cf-c08f-472c-afba-cda33029203e","Type":"ContainerDied","Data":"cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b"} Dec 09 06:06:18 crc kubenswrapper[4766]: I1209 06:06:18.514182 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz64v" event={"ID":"6bbcf3cf-c08f-472c-afba-cda33029203e","Type":"ContainerStarted","Data":"d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139"} Dec 09 06:06:18 crc kubenswrapper[4766]: I1209 06:06:18.537635 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vz64v" podStartSLOduration=3.094232173 podStartE2EDuration="5.537614223s" podCreationTimestamp="2025-12-09 06:06:13 +0000 UTC" firstStartedPulling="2025-12-09 06:06:15.48101779 +0000 UTC m=+10457.190323216" lastFinishedPulling="2025-12-09 06:06:17.92439984 +0000 UTC 
m=+10459.633705266" observedRunningTime="2025-12-09 06:06:18.533992918 +0000 UTC m=+10460.243298354" watchObservedRunningTime="2025-12-09 06:06:18.537614223 +0000 UTC m=+10460.246919649" Dec 09 06:06:23 crc kubenswrapper[4766]: I1209 06:06:23.663731 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:23 crc kubenswrapper[4766]: I1209 06:06:23.664619 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:23 crc kubenswrapper[4766]: I1209 06:06:23.740977 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:24 crc kubenswrapper[4766]: I1209 06:06:24.640840 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:24 crc kubenswrapper[4766]: I1209 06:06:24.698791 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vz64v"] Dec 09 06:06:26 crc kubenswrapper[4766]: I1209 06:06:26.608928 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vz64v" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerName="registry-server" containerID="cri-o://d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139" gracePeriod=2 Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.138156 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.309755 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-utilities\") pod \"6bbcf3cf-c08f-472c-afba-cda33029203e\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.309828 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbqps\" (UniqueName: \"kubernetes.io/projected/6bbcf3cf-c08f-472c-afba-cda33029203e-kube-api-access-rbqps\") pod \"6bbcf3cf-c08f-472c-afba-cda33029203e\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.309918 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-catalog-content\") pod \"6bbcf3cf-c08f-472c-afba-cda33029203e\" (UID: \"6bbcf3cf-c08f-472c-afba-cda33029203e\") " Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.310793 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-utilities" (OuterVolumeSpecName: "utilities") pod "6bbcf3cf-c08f-472c-afba-cda33029203e" (UID: "6bbcf3cf-c08f-472c-afba-cda33029203e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.316067 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bbcf3cf-c08f-472c-afba-cda33029203e-kube-api-access-rbqps" (OuterVolumeSpecName: "kube-api-access-rbqps") pod "6bbcf3cf-c08f-472c-afba-cda33029203e" (UID: "6bbcf3cf-c08f-472c-afba-cda33029203e"). InnerVolumeSpecName "kube-api-access-rbqps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.358049 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bbcf3cf-c08f-472c-afba-cda33029203e" (UID: "6bbcf3cf-c08f-472c-afba-cda33029203e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.412284 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.412318 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbqps\" (UniqueName: \"kubernetes.io/projected/6bbcf3cf-c08f-472c-afba-cda33029203e-kube-api-access-rbqps\") on node \"crc\" DevicePath \"\"" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.412329 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bbcf3cf-c08f-472c-afba-cda33029203e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.668912 4766 generic.go:334] "Generic (PLEG): container finished" podID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerID="d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139" exitCode=0 Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.668964 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vz64v" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.668964 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz64v" event={"ID":"6bbcf3cf-c08f-472c-afba-cda33029203e","Type":"ContainerDied","Data":"d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139"} Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.669103 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vz64v" event={"ID":"6bbcf3cf-c08f-472c-afba-cda33029203e","Type":"ContainerDied","Data":"64a2632fb667ae281f46ea4219c887486627fc80f34c6104526fa34d98e3943f"} Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.669138 4766 scope.go:117] "RemoveContainer" containerID="d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.717469 4766 scope.go:117] "RemoveContainer" containerID="cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.725756 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vz64v"] Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.738023 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vz64v"] Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.751163 4766 scope.go:117] "RemoveContainer" containerID="42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.805864 4766 scope.go:117] "RemoveContainer" containerID="d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139" Dec 09 06:06:27 crc kubenswrapper[4766]: E1209 06:06:27.806564 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139\": container with ID starting with d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139 not found: ID does not exist" containerID="d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.806607 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139"} err="failed to get container status \"d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139\": rpc error: code = NotFound desc = could not find container \"d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139\": container with ID starting with d8679565929bfc5163c2015420ad353f1f882aaea73c44d88825731856a29139 not found: ID does not exist" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.806636 4766 scope.go:117] "RemoveContainer" containerID="cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b" Dec 09 06:06:27 crc kubenswrapper[4766]: E1209 06:06:27.807545 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b\": container with ID starting with cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b not found: ID does not exist" containerID="cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.807573 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b"} err="failed to get container status \"cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b\": rpc error: code = NotFound desc = could not find container \"cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b\": container with ID 
starting with cfa4461406332f50344affbe1141126cd2cf6f3b43c357eff4810fc49f99270b not found: ID does not exist" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.807591 4766 scope.go:117] "RemoveContainer" containerID="42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5" Dec 09 06:06:27 crc kubenswrapper[4766]: E1209 06:06:27.808665 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5\": container with ID starting with 42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5 not found: ID does not exist" containerID="42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5" Dec 09 06:06:27 crc kubenswrapper[4766]: I1209 06:06:27.808690 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5"} err="failed to get container status \"42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5\": rpc error: code = NotFound desc = could not find container \"42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5\": container with ID starting with 42fe24c11ffc1d6aa12aedcef35fea04b0239c0a6f9257254cefd4176b615fc5 not found: ID does not exist" Dec 09 06:06:28 crc kubenswrapper[4766]: I1209 06:06:28.865587 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" path="/var/lib/kubelet/pods/6bbcf3cf-c08f-472c-afba-cda33029203e/volumes" Dec 09 06:07:02 crc kubenswrapper[4766]: I1209 06:07:02.046792 4766 generic.go:334] "Generic (PLEG): container finished" podID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerID="5eb40f6f50a2b56a89e52b841904030db4f72e34c0a2c482140ba3d8c013d6b3" exitCode=0 Dec 09 06:07:02 crc kubenswrapper[4766]: I1209 06:07:02.048614 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-47jdn/must-gather-s8k2m" event={"ID":"b04d667e-4be1-4832-b9f0-bf6942ff1c7e","Type":"ContainerDied","Data":"5eb40f6f50a2b56a89e52b841904030db4f72e34c0a2c482140ba3d8c013d6b3"} Dec 09 06:07:02 crc kubenswrapper[4766]: I1209 06:07:02.049578 4766 scope.go:117] "RemoveContainer" containerID="5eb40f6f50a2b56a89e52b841904030db4f72e34c0a2c482140ba3d8c013d6b3" Dec 09 06:07:02 crc kubenswrapper[4766]: I1209 06:07:02.133267 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-47jdn_must-gather-s8k2m_b04d667e-4be1-4832-b9f0-bf6942ff1c7e/gather/0.log" Dec 09 06:07:07 crc kubenswrapper[4766]: I1209 06:07:07.316678 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 06:07:07 crc kubenswrapper[4766]: I1209 06:07:07.317189 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 06:07:09 crc kubenswrapper[4766]: I1209 06:07:09.864185 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-47jdn/must-gather-s8k2m"] Dec 09 06:07:09 crc kubenswrapper[4766]: I1209 06:07:09.864762 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-47jdn/must-gather-s8k2m" podUID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerName="copy" containerID="cri-o://75395de82e0940bd7d099e90c1f737824260bbd0ff3437841632eb95fd3045e1" gracePeriod=2 Dec 09 06:07:09 crc kubenswrapper[4766]: I1209 06:07:09.876492 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-47jdn/must-gather-s8k2m"] Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.154374 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-47jdn_must-gather-s8k2m_b04d667e-4be1-4832-b9f0-bf6942ff1c7e/copy/0.log" Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.155072 4766 generic.go:334] "Generic (PLEG): container finished" podID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerID="75395de82e0940bd7d099e90c1f737824260bbd0ff3437841632eb95fd3045e1" exitCode=143 Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.472261 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-47jdn_must-gather-s8k2m_b04d667e-4be1-4832-b9f0-bf6942ff1c7e/copy/0.log" Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.473180 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.507169 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-must-gather-output\") pod \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\" (UID: \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\") " Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.507672 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm5qp\" (UniqueName: \"kubernetes.io/projected/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-kube-api-access-bm5qp\") pod \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\" (UID: \"b04d667e-4be1-4832-b9f0-bf6942ff1c7e\") " Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.516454 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-kube-api-access-bm5qp" (OuterVolumeSpecName: "kube-api-access-bm5qp") pod "b04d667e-4be1-4832-b9f0-bf6942ff1c7e" 
(UID: "b04d667e-4be1-4832-b9f0-bf6942ff1c7e"). InnerVolumeSpecName "kube-api-access-bm5qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.611813 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm5qp\" (UniqueName: \"kubernetes.io/projected/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-kube-api-access-bm5qp\") on node \"crc\" DevicePath \"\"" Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.745243 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b04d667e-4be1-4832-b9f0-bf6942ff1c7e" (UID: "b04d667e-4be1-4832-b9f0-bf6942ff1c7e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.817526 4766 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b04d667e-4be1-4832-b9f0-bf6942ff1c7e-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 09 06:07:10 crc kubenswrapper[4766]: I1209 06:07:10.853869 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" path="/var/lib/kubelet/pods/b04d667e-4be1-4832-b9f0-bf6942ff1c7e/volumes" Dec 09 06:07:11 crc kubenswrapper[4766]: I1209 06:07:11.166161 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-47jdn_must-gather-s8k2m_b04d667e-4be1-4832-b9f0-bf6942ff1c7e/copy/0.log" Dec 09 06:07:11 crc kubenswrapper[4766]: I1209 06:07:11.167276 4766 scope.go:117] "RemoveContainer" containerID="75395de82e0940bd7d099e90c1f737824260bbd0ff3437841632eb95fd3045e1" Dec 09 06:07:11 crc kubenswrapper[4766]: I1209 06:07:11.167347 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-47jdn/must-gather-s8k2m" Dec 09 06:07:11 crc kubenswrapper[4766]: I1209 06:07:11.190419 4766 scope.go:117] "RemoveContainer" containerID="5eb40f6f50a2b56a89e52b841904030db4f72e34c0a2c482140ba3d8c013d6b3" Dec 09 06:07:37 crc kubenswrapper[4766]: I1209 06:07:37.347302 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 06:07:37 crc kubenswrapper[4766]: I1209 06:07:37.347841 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 06:08:07 crc kubenswrapper[4766]: I1209 06:08:07.316614 4766 patch_prober.go:28] interesting pod/machine-config-daemon-db9hx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 06:08:07 crc kubenswrapper[4766]: I1209 06:08:07.317274 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 06:08:07 crc kubenswrapper[4766]: I1209 06:08:07.317335 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" Dec 09 06:08:07 crc kubenswrapper[4766]: 
I1209 06:08:07.318289 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6"} pod="openshift-machine-config-operator/machine-config-daemon-db9hx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 06:08:07 crc kubenswrapper[4766]: I1209 06:08:07.318348 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerName="machine-config-daemon" containerID="cri-o://9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" gracePeriod=600 Dec 09 06:08:07 crc kubenswrapper[4766]: E1209 06:08:07.447065 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:08:07 crc kubenswrapper[4766]: I1209 06:08:07.795036 4766 generic.go:334] "Generic (PLEG): container finished" podID="a42b369b-e4ad-447c-b9b1-5c2461116838" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" exitCode=0 Dec 09 06:08:07 crc kubenswrapper[4766]: I1209 06:08:07.795093 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" event={"ID":"a42b369b-e4ad-447c-b9b1-5c2461116838","Type":"ContainerDied","Data":"9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6"} Dec 09 06:08:07 crc kubenswrapper[4766]: I1209 06:08:07.795183 4766 scope.go:117] "RemoveContainer" 
containerID="4797219e0810bddab585efd706eef9cd1de0b04bb6ec690af775b468156933a0" Dec 09 06:08:07 crc kubenswrapper[4766]: I1209 06:08:07.795923 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:08:07 crc kubenswrapper[4766]: E1209 06:08:07.796224 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.418153 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dgq69"] Dec 09 06:08:17 crc kubenswrapper[4766]: E1209 06:08:17.419108 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerName="extract-utilities" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.419122 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerName="extract-utilities" Dec 09 06:08:17 crc kubenswrapper[4766]: E1209 06:08:17.419133 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerName="registry-server" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.419139 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerName="registry-server" Dec 09 06:08:17 crc kubenswrapper[4766]: E1209 06:08:17.419164 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerName="gather" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.419170 4766 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerName="gather" Dec 09 06:08:17 crc kubenswrapper[4766]: E1209 06:08:17.419184 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerName="extract-content" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.419189 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerName="extract-content" Dec 09 06:08:17 crc kubenswrapper[4766]: E1209 06:08:17.419201 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerName="copy" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.419207 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerName="copy" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.419409 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerName="copy" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.419429 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bbcf3cf-c08f-472c-afba-cda33029203e" containerName="registry-server" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.419452 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04d667e-4be1-4832-b9f0-bf6942ff1c7e" containerName="gather" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.421016 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.440954 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgq69"] Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.449242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flh7d\" (UniqueName: \"kubernetes.io/projected/1316bfec-5c26-4120-b666-9eb2701e2e3b-kube-api-access-flh7d\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.449358 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-catalog-content\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.449518 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-utilities\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.551763 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flh7d\" (UniqueName: \"kubernetes.io/projected/1316bfec-5c26-4120-b666-9eb2701e2e3b-kube-api-access-flh7d\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.551838 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-catalog-content\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.551901 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-utilities\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.552562 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-utilities\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.552657 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-catalog-content\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.580467 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flh7d\" (UniqueName: \"kubernetes.io/projected/1316bfec-5c26-4120-b666-9eb2701e2e3b-kube-api-access-flh7d\") pod \"redhat-marketplace-dgq69\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:17 crc kubenswrapper[4766]: I1209 06:08:17.746066 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:18 crc kubenswrapper[4766]: I1209 06:08:18.265107 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgq69"] Dec 09 06:08:18 crc kubenswrapper[4766]: I1209 06:08:18.951961 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgq69" event={"ID":"1316bfec-5c26-4120-b666-9eb2701e2e3b","Type":"ContainerDied","Data":"f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9"} Dec 09 06:08:18 crc kubenswrapper[4766]: I1209 06:08:18.954319 4766 generic.go:334] "Generic (PLEG): container finished" podID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerID="f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9" exitCode=0 Dec 09 06:08:18 crc kubenswrapper[4766]: I1209 06:08:18.954419 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgq69" event={"ID":"1316bfec-5c26-4120-b666-9eb2701e2e3b","Type":"ContainerStarted","Data":"3215df6b71013c57270a31c6ef84f462e4769bcfce355c5a193ce8b2043cacb5"} Dec 09 06:08:19 crc kubenswrapper[4766]: I1209 06:08:19.965557 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgq69" event={"ID":"1316bfec-5c26-4120-b666-9eb2701e2e3b","Type":"ContainerStarted","Data":"cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25"} Dec 09 06:08:20 crc kubenswrapper[4766]: I1209 06:08:20.977609 4766 generic.go:334] "Generic (PLEG): container finished" podID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerID="cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25" exitCode=0 Dec 09 06:08:20 crc kubenswrapper[4766]: I1209 06:08:20.977721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgq69" 
event={"ID":"1316bfec-5c26-4120-b666-9eb2701e2e3b","Type":"ContainerDied","Data":"cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25"} Dec 09 06:08:21 crc kubenswrapper[4766]: I1209 06:08:21.839799 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:08:21 crc kubenswrapper[4766]: E1209 06:08:21.840368 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:08:21 crc kubenswrapper[4766]: I1209 06:08:21.990942 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgq69" event={"ID":"1316bfec-5c26-4120-b666-9eb2701e2e3b","Type":"ContainerStarted","Data":"aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de"} Dec 09 06:08:22 crc kubenswrapper[4766]: I1209 06:08:22.023181 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dgq69" podStartSLOduration=2.515006692 podStartE2EDuration="5.023151315s" podCreationTimestamp="2025-12-09 06:08:17 +0000 UTC" firstStartedPulling="2025-12-09 06:08:18.955172914 +0000 UTC m=+10580.664478380" lastFinishedPulling="2025-12-09 06:08:21.463317577 +0000 UTC m=+10583.172623003" observedRunningTime="2025-12-09 06:08:22.014032572 +0000 UTC m=+10583.723338008" watchObservedRunningTime="2025-12-09 06:08:22.023151315 +0000 UTC m=+10583.732456781" Dec 09 06:08:27 crc kubenswrapper[4766]: I1209 06:08:27.746977 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:27 crc 
kubenswrapper[4766]: I1209 06:08:27.747476 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:27 crc kubenswrapper[4766]: I1209 06:08:27.799178 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:28 crc kubenswrapper[4766]: I1209 06:08:28.154336 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:28 crc kubenswrapper[4766]: I1209 06:08:28.212709 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgq69"] Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.096361 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dgq69" podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerName="registry-server" containerID="cri-o://aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de" gracePeriod=2 Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.612523 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.653791 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-catalog-content\") pod \"1316bfec-5c26-4120-b666-9eb2701e2e3b\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.654020 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flh7d\" (UniqueName: \"kubernetes.io/projected/1316bfec-5c26-4120-b666-9eb2701e2e3b-kube-api-access-flh7d\") pod \"1316bfec-5c26-4120-b666-9eb2701e2e3b\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.654107 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-utilities\") pod \"1316bfec-5c26-4120-b666-9eb2701e2e3b\" (UID: \"1316bfec-5c26-4120-b666-9eb2701e2e3b\") " Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.654994 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-utilities" (OuterVolumeSpecName: "utilities") pod "1316bfec-5c26-4120-b666-9eb2701e2e3b" (UID: "1316bfec-5c26-4120-b666-9eb2701e2e3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.660125 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1316bfec-5c26-4120-b666-9eb2701e2e3b-kube-api-access-flh7d" (OuterVolumeSpecName: "kube-api-access-flh7d") pod "1316bfec-5c26-4120-b666-9eb2701e2e3b" (UID: "1316bfec-5c26-4120-b666-9eb2701e2e3b"). InnerVolumeSpecName "kube-api-access-flh7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.694486 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1316bfec-5c26-4120-b666-9eb2701e2e3b" (UID: "1316bfec-5c26-4120-b666-9eb2701e2e3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.756831 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flh7d\" (UniqueName: \"kubernetes.io/projected/1316bfec-5c26-4120-b666-9eb2701e2e3b-kube-api-access-flh7d\") on node \"crc\" DevicePath \"\"" Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.756870 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 06:08:30 crc kubenswrapper[4766]: I1209 06:08:30.756880 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1316bfec-5c26-4120-b666-9eb2701e2e3b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.107981 4766 generic.go:334] "Generic (PLEG): container finished" podID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerID="aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de" exitCode=0 Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.108019 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgq69" event={"ID":"1316bfec-5c26-4120-b666-9eb2701e2e3b","Type":"ContainerDied","Data":"aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de"} Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.108052 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-dgq69" event={"ID":"1316bfec-5c26-4120-b666-9eb2701e2e3b","Type":"ContainerDied","Data":"3215df6b71013c57270a31c6ef84f462e4769bcfce355c5a193ce8b2043cacb5"} Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.108062 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgq69" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.108070 4766 scope.go:117] "RemoveContainer" containerID="aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.131805 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgq69"] Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.137453 4766 scope.go:117] "RemoveContainer" containerID="cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.143159 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgq69"] Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.161400 4766 scope.go:117] "RemoveContainer" containerID="f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.198683 4766 scope.go:117] "RemoveContainer" containerID="aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de" Dec 09 06:08:31 crc kubenswrapper[4766]: E1209 06:08:31.199111 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de\": container with ID starting with aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de not found: ID does not exist" containerID="aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.199152 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de"} err="failed to get container status \"aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de\": rpc error: code = NotFound desc = could not find container \"aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de\": container with ID starting with aebdf3940916fbf3b93480ee18675fd89f8685465fe672542a74ba78b295e1de not found: ID does not exist" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.199191 4766 scope.go:117] "RemoveContainer" containerID="cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25" Dec 09 06:08:31 crc kubenswrapper[4766]: E1209 06:08:31.199636 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25\": container with ID starting with cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25 not found: ID does not exist" containerID="cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.199667 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25"} err="failed to get container status \"cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25\": rpc error: code = NotFound desc = could not find container \"cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25\": container with ID starting with cbb5e684dd237b3f597c046335e7acc331a266027c810c675a6074ec8e835c25 not found: ID does not exist" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.199684 4766 scope.go:117] "RemoveContainer" containerID="f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9" Dec 09 06:08:31 crc kubenswrapper[4766]: E1209 
06:08:31.199978 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9\": container with ID starting with f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9 not found: ID does not exist" containerID="f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9" Dec 09 06:08:31 crc kubenswrapper[4766]: I1209 06:08:31.199999 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9"} err="failed to get container status \"f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9\": rpc error: code = NotFound desc = could not find container \"f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9\": container with ID starting with f0ce34b525ae50bb9a9abd05ebcfa8787ed69cbb280186adfd2dfe5d3b8277f9 not found: ID does not exist" Dec 09 06:08:32 crc kubenswrapper[4766]: I1209 06:08:32.868615 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" path="/var/lib/kubelet/pods/1316bfec-5c26-4120-b666-9eb2701e2e3b/volumes" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.804916 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wqnsj"] Dec 09 06:08:33 crc kubenswrapper[4766]: E1209 06:08:33.806344 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerName="extract-utilities" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.806453 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerName="extract-utilities" Dec 09 06:08:33 crc kubenswrapper[4766]: E1209 06:08:33.806607 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerName="registry-server" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.806711 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerName="registry-server" Dec 09 06:08:33 crc kubenswrapper[4766]: E1209 06:08:33.806816 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerName="extract-content" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.806915 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerName="extract-content" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.807334 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1316bfec-5c26-4120-b666-9eb2701e2e3b" containerName="registry-server" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.809529 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.817557 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-catalog-content\") pod \"redhat-operators-wqnsj\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.817908 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-utilities\") pod \"redhat-operators-wqnsj\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.818116 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8szz8\" (UniqueName: \"kubernetes.io/projected/a94ddda8-3e0e-430a-aa63-3f30bf350a25-kube-api-access-8szz8\") pod \"redhat-operators-wqnsj\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.843604 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqnsj"] Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.919563 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-catalog-content\") pod \"redhat-operators-wqnsj\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.919674 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-utilities\") pod \"redhat-operators-wqnsj\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.919848 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8szz8\" (UniqueName: \"kubernetes.io/projected/a94ddda8-3e0e-430a-aa63-3f30bf350a25-kube-api-access-8szz8\") pod \"redhat-operators-wqnsj\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.920168 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-catalog-content\") pod \"redhat-operators-wqnsj\" (UID: 
\"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.920601 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-utilities\") pod \"redhat-operators-wqnsj\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:33 crc kubenswrapper[4766]: I1209 06:08:33.948028 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8szz8\" (UniqueName: \"kubernetes.io/projected/a94ddda8-3e0e-430a-aa63-3f30bf350a25-kube-api-access-8szz8\") pod \"redhat-operators-wqnsj\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:34 crc kubenswrapper[4766]: I1209 06:08:34.132259 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:34 crc kubenswrapper[4766]: I1209 06:08:34.610152 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wqnsj"] Dec 09 06:08:35 crc kubenswrapper[4766]: I1209 06:08:35.151179 4766 generic.go:334] "Generic (PLEG): container finished" podID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerID="26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a" exitCode=0 Dec 09 06:08:35 crc kubenswrapper[4766]: I1209 06:08:35.151360 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnsj" event={"ID":"a94ddda8-3e0e-430a-aa63-3f30bf350a25","Type":"ContainerDied","Data":"26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a"} Dec 09 06:08:35 crc kubenswrapper[4766]: I1209 06:08:35.151438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnsj" 
event={"ID":"a94ddda8-3e0e-430a-aa63-3f30bf350a25","Type":"ContainerStarted","Data":"c03c5ca254c66e1accd799e7214f40e04b6037ce81e23843e9eb61b67a5937cd"} Dec 09 06:08:36 crc kubenswrapper[4766]: I1209 06:08:36.841121 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:08:36 crc kubenswrapper[4766]: E1209 06:08:36.842126 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:08:37 crc kubenswrapper[4766]: I1209 06:08:37.291079 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnsj" event={"ID":"a94ddda8-3e0e-430a-aa63-3f30bf350a25","Type":"ContainerStarted","Data":"f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805"} Dec 09 06:08:39 crc kubenswrapper[4766]: I1209 06:08:39.313194 4766 generic.go:334] "Generic (PLEG): container finished" podID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerID="f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805" exitCode=0 Dec 09 06:08:39 crc kubenswrapper[4766]: I1209 06:08:39.313286 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnsj" event={"ID":"a94ddda8-3e0e-430a-aa63-3f30bf350a25","Type":"ContainerDied","Data":"f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805"} Dec 09 06:08:40 crc kubenswrapper[4766]: I1209 06:08:40.324788 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnsj" 
event={"ID":"a94ddda8-3e0e-430a-aa63-3f30bf350a25","Type":"ContainerStarted","Data":"2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef"} Dec 09 06:08:40 crc kubenswrapper[4766]: I1209 06:08:40.355039 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wqnsj" podStartSLOduration=2.757163105 podStartE2EDuration="7.355017763s" podCreationTimestamp="2025-12-09 06:08:33 +0000 UTC" firstStartedPulling="2025-12-09 06:08:35.153093545 +0000 UTC m=+10596.862398971" lastFinishedPulling="2025-12-09 06:08:39.750948203 +0000 UTC m=+10601.460253629" observedRunningTime="2025-12-09 06:08:40.342679077 +0000 UTC m=+10602.051984523" watchObservedRunningTime="2025-12-09 06:08:40.355017763 +0000 UTC m=+10602.064323189" Dec 09 06:08:44 crc kubenswrapper[4766]: I1209 06:08:44.132860 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:44 crc kubenswrapper[4766]: I1209 06:08:44.133469 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:45 crc kubenswrapper[4766]: I1209 06:08:45.194190 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wqnsj" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="registry-server" probeResult="failure" output=< Dec 09 06:08:45 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Dec 09 06:08:45 crc kubenswrapper[4766]: > Dec 09 06:08:47 crc kubenswrapper[4766]: I1209 06:08:47.839503 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:08:47 crc kubenswrapper[4766]: E1209 06:08:47.840130 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:08:54 crc kubenswrapper[4766]: I1209 06:08:54.183759 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:54 crc kubenswrapper[4766]: I1209 06:08:54.251507 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:54 crc kubenswrapper[4766]: I1209 06:08:54.422837 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wqnsj"] Dec 09 06:08:55 crc kubenswrapper[4766]: I1209 06:08:55.496325 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wqnsj" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="registry-server" containerID="cri-o://2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef" gracePeriod=2 Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.032336 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.225092 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-utilities\") pod \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.225614 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-catalog-content\") pod \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.225807 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8szz8\" (UniqueName: \"kubernetes.io/projected/a94ddda8-3e0e-430a-aa63-3f30bf350a25-kube-api-access-8szz8\") pod \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\" (UID: \"a94ddda8-3e0e-430a-aa63-3f30bf350a25\") " Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.226616 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-utilities" (OuterVolumeSpecName: "utilities") pod "a94ddda8-3e0e-430a-aa63-3f30bf350a25" (UID: "a94ddda8-3e0e-430a-aa63-3f30bf350a25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.234506 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94ddda8-3e0e-430a-aa63-3f30bf350a25-kube-api-access-8szz8" (OuterVolumeSpecName: "kube-api-access-8szz8") pod "a94ddda8-3e0e-430a-aa63-3f30bf350a25" (UID: "a94ddda8-3e0e-430a-aa63-3f30bf350a25"). InnerVolumeSpecName "kube-api-access-8szz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.329478 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8szz8\" (UniqueName: \"kubernetes.io/projected/a94ddda8-3e0e-430a-aa63-3f30bf350a25-kube-api-access-8szz8\") on node \"crc\" DevicePath \"\"" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.329536 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.339426 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a94ddda8-3e0e-430a-aa63-3f30bf350a25" (UID: "a94ddda8-3e0e-430a-aa63-3f30bf350a25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.431546 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94ddda8-3e0e-430a-aa63-3f30bf350a25-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.509560 4766 generic.go:334] "Generic (PLEG): container finished" podID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerID="2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef" exitCode=0 Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.509621 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnsj" event={"ID":"a94ddda8-3e0e-430a-aa63-3f30bf350a25","Type":"ContainerDied","Data":"2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef"} Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.509633 4766 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wqnsj" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.509667 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wqnsj" event={"ID":"a94ddda8-3e0e-430a-aa63-3f30bf350a25","Type":"ContainerDied","Data":"c03c5ca254c66e1accd799e7214f40e04b6037ce81e23843e9eb61b67a5937cd"} Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.509694 4766 scope.go:117] "RemoveContainer" containerID="2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.532414 4766 scope.go:117] "RemoveContainer" containerID="f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.555894 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wqnsj"] Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.568638 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wqnsj"] Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.574451 4766 scope.go:117] "RemoveContainer" containerID="26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.638167 4766 scope.go:117] "RemoveContainer" containerID="2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef" Dec 09 06:08:56 crc kubenswrapper[4766]: E1209 06:08:56.638905 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef\": container with ID starting with 2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef not found: ID does not exist" containerID="2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.638949 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef"} err="failed to get container status \"2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef\": rpc error: code = NotFound desc = could not find container \"2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef\": container with ID starting with 2f7156f1cc40771709d0d832a89c5d7a55a099c770d9c8f4667989ff97b2abef not found: ID does not exist" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.638978 4766 scope.go:117] "RemoveContainer" containerID="f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805" Dec 09 06:08:56 crc kubenswrapper[4766]: E1209 06:08:56.640064 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805\": container with ID starting with f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805 not found: ID does not exist" containerID="f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.640117 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805"} err="failed to get container status \"f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805\": rpc error: code = NotFound desc = could not find container \"f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805\": container with ID starting with f03133e0cc3a744403fbb322557e4c172873d9a5c7a4907250f9d64e1eee4805 not found: ID does not exist" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.640151 4766 scope.go:117] "RemoveContainer" containerID="26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a" Dec 09 06:08:56 crc kubenswrapper[4766]: E1209 
06:08:56.640807 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a\": container with ID starting with 26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a not found: ID does not exist" containerID="26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.640834 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a"} err="failed to get container status \"26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a\": rpc error: code = NotFound desc = could not find container \"26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a\": container with ID starting with 26aa174d8cfdacba3a123eee1132916d926a1a243b8bfb7d15fa814ab6b1c86a not found: ID does not exist" Dec 09 06:08:56 crc kubenswrapper[4766]: I1209 06:08:56.858371 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" path="/var/lib/kubelet/pods/a94ddda8-3e0e-430a-aa63-3f30bf350a25/volumes" Dec 09 06:08:59 crc kubenswrapper[4766]: I1209 06:08:59.839273 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:08:59 crc kubenswrapper[4766]: E1209 06:08:59.840083 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:09:13 crc kubenswrapper[4766]: I1209 06:09:13.840543 
4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:09:13 crc kubenswrapper[4766]: E1209 06:09:13.841399 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:09:27 crc kubenswrapper[4766]: I1209 06:09:27.839538 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:09:27 crc kubenswrapper[4766]: E1209 06:09:27.841931 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.114180 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vkvsg"] Dec 09 06:09:30 crc kubenswrapper[4766]: E1209 06:09:30.115114 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="registry-server" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.115149 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="registry-server" Dec 09 06:09:30 crc kubenswrapper[4766]: E1209 06:09:30.115180 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="extract-utilities" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.115188 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="extract-utilities" Dec 09 06:09:30 crc kubenswrapper[4766]: E1209 06:09:30.115242 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="extract-content" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.115251 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="extract-content" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.115520 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94ddda8-3e0e-430a-aa63-3f30bf350a25" containerName="registry-server" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.117580 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.127388 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vkvsg"] Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.295637 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlksh\" (UniqueName: \"kubernetes.io/projected/9d909277-902a-48ca-95a5-b316207ad08f-kube-api-access-wlksh\") pod \"certified-operators-vkvsg\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.296352 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-utilities\") pod \"certified-operators-vkvsg\" (UID: 
\"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.296631 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-catalog-content\") pod \"certified-operators-vkvsg\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.398354 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-utilities\") pod \"certified-operators-vkvsg\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.398816 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-catalog-content\") pod \"certified-operators-vkvsg\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.398969 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlksh\" (UniqueName: \"kubernetes.io/projected/9d909277-902a-48ca-95a5-b316207ad08f-kube-api-access-wlksh\") pod \"certified-operators-vkvsg\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.399743 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-utilities\") pod \"certified-operators-vkvsg\" (UID: 
\"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.400061 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-catalog-content\") pod \"certified-operators-vkvsg\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.420769 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlksh\" (UniqueName: \"kubernetes.io/projected/9d909277-902a-48ca-95a5-b316207ad08f-kube-api-access-wlksh\") pod \"certified-operators-vkvsg\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:30 crc kubenswrapper[4766]: I1209 06:09:30.452864 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:31 crc kubenswrapper[4766]: I1209 06:09:31.012564 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vkvsg"] Dec 09 06:09:31 crc kubenswrapper[4766]: I1209 06:09:31.894866 4766 generic.go:334] "Generic (PLEG): container finished" podID="9d909277-902a-48ca-95a5-b316207ad08f" containerID="dc1752512fa96231f5e0cefcc473b5706995afe8d0915bb7800d0d2621e05958" exitCode=0 Dec 09 06:09:31 crc kubenswrapper[4766]: I1209 06:09:31.895020 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vkvsg" event={"ID":"9d909277-902a-48ca-95a5-b316207ad08f","Type":"ContainerDied","Data":"dc1752512fa96231f5e0cefcc473b5706995afe8d0915bb7800d0d2621e05958"} Dec 09 06:09:31 crc kubenswrapper[4766]: I1209 06:09:31.895451 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vkvsg" event={"ID":"9d909277-902a-48ca-95a5-b316207ad08f","Type":"ContainerStarted","Data":"a3c93b341cbd9162fca8759b87bcea003d9a1f43f93d712334fa496e5f755dd8"} Dec 09 06:09:32 crc kubenswrapper[4766]: I1209 06:09:32.908452 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vkvsg" event={"ID":"9d909277-902a-48ca-95a5-b316207ad08f","Type":"ContainerStarted","Data":"6f7700cc93c6803210989cfc5c3c576489350352730f115ea9200836df4aa45b"} Dec 09 06:09:33 crc kubenswrapper[4766]: I1209 06:09:33.946118 4766 generic.go:334] "Generic (PLEG): container finished" podID="9d909277-902a-48ca-95a5-b316207ad08f" containerID="6f7700cc93c6803210989cfc5c3c576489350352730f115ea9200836df4aa45b" exitCode=0 Dec 09 06:09:33 crc kubenswrapper[4766]: I1209 06:09:33.946434 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vkvsg" 
event={"ID":"9d909277-902a-48ca-95a5-b316207ad08f","Type":"ContainerDied","Data":"6f7700cc93c6803210989cfc5c3c576489350352730f115ea9200836df4aa45b"} Dec 09 06:09:34 crc kubenswrapper[4766]: I1209 06:09:34.957841 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vkvsg" event={"ID":"9d909277-902a-48ca-95a5-b316207ad08f","Type":"ContainerStarted","Data":"20f430c98e82751e9d2a91f1abe1557f2eb7c630f43599c499a195bf3a98ef19"} Dec 09 06:09:34 crc kubenswrapper[4766]: I1209 06:09:34.979846 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vkvsg" podStartSLOduration=2.263180655 podStartE2EDuration="4.979821387s" podCreationTimestamp="2025-12-09 06:09:30 +0000 UTC" firstStartedPulling="2025-12-09 06:09:31.897055321 +0000 UTC m=+10653.606360747" lastFinishedPulling="2025-12-09 06:09:34.613696043 +0000 UTC m=+10656.323001479" observedRunningTime="2025-12-09 06:09:34.972189988 +0000 UTC m=+10656.681495414" watchObservedRunningTime="2025-12-09 06:09:34.979821387 +0000 UTC m=+10656.689126813" Dec 09 06:09:38 crc kubenswrapper[4766]: I1209 06:09:38.846630 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:09:38 crc kubenswrapper[4766]: E1209 06:09:38.847576 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:09:40 crc kubenswrapper[4766]: I1209 06:09:40.453600 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:40 crc 
kubenswrapper[4766]: I1209 06:09:40.453916 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:40 crc kubenswrapper[4766]: I1209 06:09:40.519596 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:41 crc kubenswrapper[4766]: I1209 06:09:41.076319 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:41 crc kubenswrapper[4766]: I1209 06:09:41.137358 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vkvsg"] Dec 09 06:09:43 crc kubenswrapper[4766]: I1209 06:09:43.043531 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vkvsg" podUID="9d909277-902a-48ca-95a5-b316207ad08f" containerName="registry-server" containerID="cri-o://20f430c98e82751e9d2a91f1abe1557f2eb7c630f43599c499a195bf3a98ef19" gracePeriod=2 Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.057904 4766 generic.go:334] "Generic (PLEG): container finished" podID="9d909277-902a-48ca-95a5-b316207ad08f" containerID="20f430c98e82751e9d2a91f1abe1557f2eb7c630f43599c499a195bf3a98ef19" exitCode=0 Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.058004 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vkvsg" event={"ID":"9d909277-902a-48ca-95a5-b316207ad08f","Type":"ContainerDied","Data":"20f430c98e82751e9d2a91f1abe1557f2eb7c630f43599c499a195bf3a98ef19"} Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.684517 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.874350 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlksh\" (UniqueName: \"kubernetes.io/projected/9d909277-902a-48ca-95a5-b316207ad08f-kube-api-access-wlksh\") pod \"9d909277-902a-48ca-95a5-b316207ad08f\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.874569 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-catalog-content\") pod \"9d909277-902a-48ca-95a5-b316207ad08f\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.874782 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-utilities\") pod \"9d909277-902a-48ca-95a5-b316207ad08f\" (UID: \"9d909277-902a-48ca-95a5-b316207ad08f\") " Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.876364 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-utilities" (OuterVolumeSpecName: "utilities") pod "9d909277-902a-48ca-95a5-b316207ad08f" (UID: "9d909277-902a-48ca-95a5-b316207ad08f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.899739 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d909277-902a-48ca-95a5-b316207ad08f-kube-api-access-wlksh" (OuterVolumeSpecName: "kube-api-access-wlksh") pod "9d909277-902a-48ca-95a5-b316207ad08f" (UID: "9d909277-902a-48ca-95a5-b316207ad08f"). InnerVolumeSpecName "kube-api-access-wlksh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.936366 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d909277-902a-48ca-95a5-b316207ad08f" (UID: "9d909277-902a-48ca-95a5-b316207ad08f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.978561 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.978605 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlksh\" (UniqueName: \"kubernetes.io/projected/9d909277-902a-48ca-95a5-b316207ad08f-kube-api-access-wlksh\") on node \"crc\" DevicePath \"\"" Dec 09 06:09:44 crc kubenswrapper[4766]: I1209 06:09:44.978619 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d909277-902a-48ca-95a5-b316207ad08f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 06:09:45 crc kubenswrapper[4766]: I1209 06:09:45.071446 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vkvsg" event={"ID":"9d909277-902a-48ca-95a5-b316207ad08f","Type":"ContainerDied","Data":"a3c93b341cbd9162fca8759b87bcea003d9a1f43f93d712334fa496e5f755dd8"} Dec 09 06:09:45 crc kubenswrapper[4766]: I1209 06:09:45.071506 4766 scope.go:117] "RemoveContainer" containerID="20f430c98e82751e9d2a91f1abe1557f2eb7c630f43599c499a195bf3a98ef19" Dec 09 06:09:45 crc kubenswrapper[4766]: I1209 06:09:45.071534 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vkvsg" Dec 09 06:09:45 crc kubenswrapper[4766]: I1209 06:09:45.109867 4766 scope.go:117] "RemoveContainer" containerID="6f7700cc93c6803210989cfc5c3c576489350352730f115ea9200836df4aa45b" Dec 09 06:09:45 crc kubenswrapper[4766]: I1209 06:09:45.115768 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vkvsg"] Dec 09 06:09:45 crc kubenswrapper[4766]: I1209 06:09:45.131426 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vkvsg"] Dec 09 06:09:45 crc kubenswrapper[4766]: I1209 06:09:45.140788 4766 scope.go:117] "RemoveContainer" containerID="dc1752512fa96231f5e0cefcc473b5706995afe8d0915bb7800d0d2621e05958" Dec 09 06:09:46 crc kubenswrapper[4766]: I1209 06:09:46.858874 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d909277-902a-48ca-95a5-b316207ad08f" path="/var/lib/kubelet/pods/9d909277-902a-48ca-95a5-b316207ad08f/volumes" Dec 09 06:09:50 crc kubenswrapper[4766]: I1209 06:09:50.840349 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:09:50 crc kubenswrapper[4766]: E1209 06:09:50.841359 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:10:01 crc kubenswrapper[4766]: I1209 06:10:01.839351 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:10:01 crc kubenswrapper[4766]: E1209 06:10:01.840083 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:10:16 crc kubenswrapper[4766]: I1209 06:10:16.841362 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:10:16 crc kubenswrapper[4766]: E1209 06:10:16.842150 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:10:28 crc kubenswrapper[4766]: I1209 06:10:28.846614 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:10:28 crc kubenswrapper[4766]: E1209 06:10:28.847397 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838" Dec 09 06:10:40 crc kubenswrapper[4766]: I1209 06:10:40.839680 4766 scope.go:117] "RemoveContainer" containerID="9b69a717897897172418f4a770ffe899981621095f6748b0110b9b329144dde6" Dec 09 06:10:40 crc kubenswrapper[4766]: E1209 06:10:40.840326 4766 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-db9hx_openshift-machine-config-operator(a42b369b-e4ad-447c-b9b1-5c2461116838)\"" pod="openshift-machine-config-operator/machine-config-daemon-db9hx" podUID="a42b369b-e4ad-447c-b9b1-5c2461116838"